
**Daniel Sloan** is an internationally recognized Six Sigma Master Black Belt and an ASQ-certified Black Belt. His 16 years of experience have been distinguished by Six Sigma seminars in Mexico, Uruguay, Brazil, Australia, and 47 of the United States. McGraw-Hill and Quality Press published five of his seven books. As Senior Vice President of Applied Business Science for a $500 million company, he led its Six Sigma initiative. With "factory floor" Six Sigma successes ranging from non-woven fabrics, extruded products, medical equipment, aerospace engineering, and automotive parts to Internet router production and health care, Daniel has a proven track record of helping companies produce bottom-line results.

**Russell Boyles** earned his PhD in Statistics at the University of California, Davis. He subsequently spent two years in the Applied Mathematics Group at Lawrence Livermore National Laboratory, two years as Director of Statistical Analysis for NERCO Minerals Company, and eight years as Statistical Process Control Manager at Precision Castparts Corporation. As a trainer and consultant, Russell specializes in Six Sigma Master Black Belt and Black Belt certification courses, Design of Experiments, Gage Studies, Reliability, and Statistical Process Control. A few of his recent papers have appeared in the ASQ publications Technometrics and the Journal of Quality Technology.

**Evidence-based Decision Services and Products**

We are the first and best provider of evidence-based decision services in the world. We help clients rapidly use the evidence in their raw data to dramatically improve bottom line business results.

**Six Sigma Services**

Master Black Belt, Black Belt, Green Belt, Champion, and Senior Executive certification training for all industries, including manufacturing, financial services, and health care. Consortium Six Sigma events for small companies that wish to pool resources. Custom-designed training events and multimedia, evidence-based Six Sigma materials.

**Evidence-based Decision Support**

Data mining and strategic information systems design. Bottom-line business results project coaching. Consulting support for private industry, government, and academic institutions that are implementing evidence-based decision systems. Custom-designed training events and multimedia, evidence-based education and training materials.

For more information, visit http://www.danielsloan.com or call:

Sloan Consulting, Seattle, WA (206) 525-7858

M. Daniel Sloan, author and owner of the copyright for this work, has licensed it under the Creative Commons Attribution-NonCommercial-NoDerivs (by-nc-nd) License. http://www.danielsloan.com is the legal file-download location. To view this license, visit:

http://creativecommons.org/licenses/by-nc-nd/2.5/
http://creativecommons.org/licenses/by-nc-nd/2.5/legalcode

Or send a letter to: Creative Commons, Corporate Headquarters, 543 Howard Street, 5th Floor, San Francisco, CA 94105-3013, United States.

Profit Signals

How Evidence-based Decisions Power Six Sigma Breakthroughs

By M. Daniel Sloan and Russell A. Boyles, PhD

Sloan Consulting, LLC Seattle, Washington

© M. Daniel Sloan and Russell A. Boyles, All Rights Reserved, 2003

Profit Signals: How Evidence-Based Decisions Power Six Sigma Breakthroughs
M. Daniel Sloan and Russell A. Boyles

Library of Congress Cataloging-in-Publication Data
Sloan, M. Daniel, 1950–
Boyles, Russell A., 1951–
Profit signals: how evidence-based decisions power six sigma breakthroughs / M. Daniel Sloan and Russell A. Boyles
Includes bibliographical references and index.
1. Six Sigma—Quality control—Statistical models. 2. Medical care—Quality assurance—Statistical models. 3. Cost control—Statistical models—Mathematical models.

© 2003 by Evidence-Based Decisions, Inc. http://www.evidence-based-decisions.com

All rights reserved. No part of this book may be reproduced in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher. Your support of authors' rights is appreciated. For permissions, the authors can be contacted directly.

Sloan Consulting, http://www.danielsloan.com/, 206-525-7858, 10035 46th AVE NE, Seattle WA 98125

Trademark Acknowledgements

Profit Signals®, the phrase "Vector Analysis Applied to a Data Matrix®", and the Profit Signals tetrahedron on the book's cover are registered trademarks of Sloan Consulting, LLC. Six Sigma® is a registered trademark and service mark of Motorola, Incorporated. Sculpey Clay® is a registered trademark of Polyform Products Co. Excel® is a registered trademark of Microsoft. Other copyright notices are listed in the production notes at the end of the book.

Illustrations: Cover, Robin Hing. Tables and illustrations, Robin Hing, Russell A. Boyles, M. Daniel Sloan, John Pendleton, Austin Sloan, and Alan Tomko. Netter illustrations used with permission from Icon Learning Systems, a division of MediMedia USA, Inc. All rights reserved.

The book's design and layout, using Adobe InDesign 2.0.2, were completed by M. Daniel Sloan. Printed in the United States of America.


Many of the most useful designs are extremely simple.
Ronald Aylmer Fisher

How much variation should we leave to chance?
Walter A. Shewhart


Table of Contents

Premise
  The Parable of the Paper Bags
  The Dollar Value of Evidence
  Six Sigma
  How to Read This Book
  Endnotes

Chapter 1—The Five-Minute PhD
  Start Your Stopwatch Now
  Business Art and Science
  Profit Signals
  Data Recycling
  The Full Circle of Data Discovery
  The New Management Equation
  Closing Arguments
  Endnotes

Chapter 2—Standards of Evidence
  Poetry versus Science
  "Scientific" Management
  Cost Accounting Variance Analysis
  Accounting versus Science
  Delusions and Bamboozles
  Vector Analysis 101
  Degrees of Freedom
  Bar Chart Bamboozles
  The Game is Afoot
  Spreadsheet versus Data Matrix
  P-values, Confidence Levels and Standards of Evidence
  Closing Arguments
  Endnotes

Chapter 3—Evidence-based Six Sigma
  Six Sigma (6σ) Basics
  The Six Sigma Profit Strategy
  Lucrative Project Results Map
  Define, Measure, Analyze, Improve, Control
  Lucrative Project Selection
  Financial Modeling and Simulation
  Compare and Contrast Analysis
  Process Maps
  The Costs of Poor Quality
  Process Capability
  Endnotes

Chapter 4—Case Studies
  Customer Service – Governmental Agency
  Days in Accounts Receivable
  Breaking the Time Barrier
  "Beating Heart" Bypass Grafts
  The Daily Grind
  "Die Tuning" for Vinyl Extrusion
  Endnotes

Chapter 5—Using Profit Signals
  A Better Way to Look At Numbers
  Corrugated Copters
  Testing the Current Way of Doing Things
  Overcoming Obstacles
  Comparing Two Ways of Doing Things
  Comparing Three Ways of Doing Things
  Comparing Eight Ways of Doing Things
  Comparing 256 Ways of Doing Things
  Chapter Homework
  Closing Arguments
  Endnotes

Chapter 6—Predicting Profits
  Fingerprint Evidence
  Three Wishes
  Prediction Practice
  Predicting Real Flight Times
  Closing Arguments
  Endnotes

Chapter 7—Sustaining Results
  Evaluating Practices and Profits
  Process Improvement Simulation
  Monitoring Practices and Profits
  Taking Action
  Closing Arguments
  Endnotes

Chapter 8—The Three Rs
  Six Sigma's Hidden Factory
  Our Proposal
  Endnotes

Appendices
  I. Vector Analysis and Evidence-based Decisions
  II. Glossary of Terms: Data Matrix, Vector Analysis and Evidence-based Decisions
  III. Six Sigma Black Belt/Expert 16 Class Curriculum Outline
  IV. Profit Signals Production Notes
  The Business Bookshelf

Index

Premise

Profit Signals is a guide for using evidence to make better, more profitable business decisions. This book will show you how to turn measurements into evidence and evidence into profit.

Face-value judgments, gut feelings, opinions, suspicions, and superstitions are not pathways to evidence. Measurements become evidence when they are analyzed correctly. Since 1920, the correct analysis has consisted of a vector analysis applied to a data matrix.1 Very few people know this. Sir Ronald Fisher first explained it at the beginning of the 1920s.

Evidence is the foundation of Profit Signals, and vector analysis is the foundation for evidence. Vector analysis is a must-have, fundamental job skill. Every person in every organization can use this tool to make more money. With this book, we aim to transform the arcane mysteries of vector analysis into common knowledge.

The word "generalization" usually denotes a thoughtless, broad assertion with no basis in fact. By contrast, a Generalization is a verifiable law of the universe. Thus one word has two, opposite meanings. The laws of motion are a Generalization. The laws of gravity are a Generalization; gravity is a physical constant of our universe. The laws of Variation, the chance fluctuations or Noise that attend every measurement, are a Generalization.2 They are Law. Unlike a generalization, a Generalization delivers valid conclusions and accurate predictions. Vector analysis is a vast, audacious and empirically true Generalization. Just as you can measure gravity (Newton and the apple), Generalizations like statistical variation can be tested and validated through a process of experimentation, observation, and analysis.

Evidence-based decisions focus on the three vectors on the right side of the following vector analysis equation: Raw Data = Data Average + Profit Signal + Noise. A vector is a set of numbers that is treated as a single entity. A vector defines magnitude and direction. It is best visualized as an arrow connecting one point in space to another. An evidence-based decision evaluates three key vectors: 1) a Data Average, 2) a Profit Signal, and 3) Noise. Profit signals are the most important element in any data-driven business decision, and vector analysis is the only way to identify them. The keys to making better, more profitable business decisions are (1) identifying and (2) interpreting the profit signals in your raw data.

The vector analysis equation is much easier to understand when it is presented as a picture. A vector analysis requires a minimum of three Generalized dimensions. The six edges of the tetrahedron shown in Figure 1 represent the six different ways of combining the three vectors on the right side of the equation. We call this stable geometric figure "the cornerstone of evidence."

Figure 1: The cornerstone of evidence.

The cornerstone of evidence is even easier to grasp in its physical form. In Profit Signals you will learn how to build one with bamboo skewers and Sculpey Clay.® The construction process is fun and informative. Knowing how to find and graph your profit signals is an extraordinary money-making skill. You will soon know why profit signals are invaluable.

Relatively few people are aware of vector analysis and its universal relevance. Many do not understand the fundamental difference between a data matrix and a spreadsheet. Profit Signals fills in those educational gaps. With every chapter you will understand more clearly what profit signals are.

The break-even school of thought has dominated business decisions since 1918. That was the year G. Charter Harrison, a London accountant employed by Price, Waterhouse & Company, published "Principles of a Cost System Based on Standards" in Industrial Engineering magazine.5 Since then, Harrison's accounting principles and procedures have become universally accepted. They are known today as cost-accounting variance analysis.

Figure 2 illustrates traditional break-even point analysis.

Figure 2: The break-even school of thought was founded by G. Charter Harrison in 1918. (Chart axes: Dollars versus Product or Service Volume, with Averaged Expenses and Averaged Income lines crossing at the break-even point.)

Break-even analysis assumes that average expenses and average income are perfect linear functions of volume. The lines cross at the "break-even point." Technically speaking, cost-accounting variance analysis is inherently one-dimensional. It is based on the difference between the two lines, the differences between average expense and average income at various volumes; these differences are called predicted values.6 For example, as shown in Figure 3, these differences form the vector at the back of the tetrahedron. It is but one of the six vectors required for a complete, three-dimensional vector analysis.

Figure 3: The differences between average income and average expense form one vector in the set of six required for a complete vector analysis.

In break-even analysis, predicted values are used in isolation. In other accounting cases, the analysis is based solely on the raw data.7 Unfortunately, there was no cross-pollination between Harrison's work in London and Sir Ronald Fisher's simultaneous 1918 development of vector analysis in rural England. Instead, cost-accounting variance analysis evolved as a collection of one-dimensional methods. Establishing performance standards and evaluating actual results in relation to them were important steps forward. But because the methods of cost-accounting variance analysis are inherently one-dimensional, it is impossible for any of them to produce a correct analysis.

Modern textbooks teach cost-accounting variance analysis as a way to identify causes of profits and losses. According to one Top-10 business-school accounting text, a cost-accounting variance analysis can "…decompose the total difference between planned and actual performance into elements that can be assigned to individual responsibility centers."8

The accuracy of any predicted value depends on the length of the noise vector. Equally important, the strength of evidence supporting any conclusion depends on a ratio involving the profit signal and noise vectors. This F ratio, as it is called, compares the length of the profit signal vector to the length of the noise vector.
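The one-dimensional break-even model of Figure 2 boils down to a single formula: the volume at which the income line crosses the expense line. Here is a minimal sketch in Python; all figures are invented for illustration, not taken from the book's case studies.

```python
# Traditional break-even analysis: income and expense are assumed to be
# straight lines in volume. All numbers below are hypothetical.
fixed_cost = 120_000.0   # fixed expenses per period
unit_cost = 35.0         # variable expense per unit
unit_price = 50.0        # income per unit

# Expense line: fixed_cost + unit_cost * v
# Income line:  unit_price * v
# The lines cross where unit_price * v == fixed_cost + unit_cost * v.
break_even_volume = fixed_cost / (unit_price - unit_cost)

print(break_even_volume)  # 8000.0 units
```

A single difference of two averaged lines is all this model uses, which is exactly the book's point: it is one vector out of the six required for a complete analysis.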

The textbook language of "decomposing" differences sounds like a vector analysis. It is not. Cost-accounting variance analysis is just arithmetic. It hasn't changed one whit since the day it was born during the 20th Century's "scientific management" craze. Like the whalebone corsets of that era, it is a constricting artifact. It is a set of one-dimensional analysis methods incapable of distinguishing profit signals from noise. It suppresses five-sixths, 83 percent, of the accounting and analysis information that is contained in raw data.

Transparency implies the full disclosure of all elements. It is a desirable accounting quality, and it is indispensable to evidence-based decisions. Cost-accounting variance analysis lacks transparency. It creates an outward appearance of propriety while it conceals covert improprieties. Vector analysis, by contrast, is transparent.

In the past, we had to wade through volumes of bewildering algebra to analyze data. Few of us ever saw the cornerstone of evidence. Few of us were able to take evidence to our bottom line: "How can I personally use vector analysis to solve my problems and make my business more profitable?" 21st Century teaching methods and computer graphics now place vector analysis in its rightful position in business decision-making.

The one formula we will use in this book is the Pythagorean Theorem. We call it the New Management Equation. (If you now have a frown on your face, you probably learned about this idea in your favorite high school class.) This equation defines the right triangles comprising the cornerstone of evidence: the square of the long side of a right triangle equals the sum of the squares of the other two sides.

c² = a² + b²

The vast majority of evidence-based decisions are based on this simple formula. This vector analysis is represented by the forward right triangle in Figure 4. The president of a $500 million company put it this way: "In old-school cost accounting we determine the variance and there the analysis stops. With the New Management Equation we determine the variance and there the analysis begins." We trust you will too.

Figure 4: Cost-accounting variance analysis ends with variations from standard values. A vector analysis begins with variations around the data average. The variations vector is then broken up into two components: 1) the Profit Signal vector and 2) the Noise vector.

The differences between a vector analysis applied to a data matrix and a spreadsheet analysis applied to arbitrary clusters of numbers are irreconcilable. The more you know about the cornerstone of evidence, the more you will understand how using just one of six possible vectors can misrepresent evidence and damage profitability. Our indictment is harsh. We will help you challenge it. Then we will ask that you vote in favor of full disclosure and transparency.

The Parable of the Paper Bags

We wrote Profit Signals for business leaders who are resolute competitors. Competitors are human. Therefore, all of us face the same challenges when we tackle the Six Sigma body of knowledge for evidence-based decisions. One of our novice students shared a personal story on a first day of training. We use her parable of the paper bags in our Six Sigma decision courses.

"What you are teaching us is a new skill that is hard to grasp. This process reminds me of the way my grandfather ran his business, and we both belong in this class.

"I loved my grandparents very much. I was very close to them. My grandfather was born in Sylva, North Carolina in 1893. He had to leave school in the second grade to go work on a tobacco farm. During the Great Depression, he and his wife couldn't earn a living. When the Federal government bought his parcel of land in North Carolina for an addition to Smokey Mountain National Park, he took the money and bought property here in Arlington, Washington. So they packed up their car and traveled across country to Darrington, Washington.

"He originally worked for the Sauk Logging Company. He split cedar shakes in his spare time to supplement the family income. But he preferred to work for himself. With his brother's family, they had 10 acres of raspberries and a five-acre garden. My grandparents planted 80 cherry trees. There were milk cows and always some beef. They sold produce. Grandpa bought and sold heifers. There were all kinds of transactions.

"But my grandfather didn't know how to multiply. Instead, he kept all his receipts in different brown paper bags; there were 13 of them altogether. Once a month he would arrange these bags in the living room. Then he would add up columns of numbers so he would know what to charge people. His system worked.

"I learned how to multiply in third grade. I got pretty good at it by the fourth grade. One day, I think it was in 1959, I came home and proudly told him I could teach him to multiply. By multiplying he wouldn't have to spend so much time with his paper bags. He listened and he learned how to do a few simple problems correctly. But he couldn't bring himself to believe in this new-fangled way of doing things. He never trusted multiplication. After he picked it up he went right back to using his paper bags.

"I have always found math to be difficult. I still do. I have to work at it and I really would rather do something that comes easy. But, gosh, what he sacrificed."

There is no doubt about it: the break-even thinking of cost-accounting variance analysis works. But the testimony of Arthur Andersen, Enron, Cendant, Coca Cola, Rite Aid, WorldCom, and other companies of former greatness suggests that the way it works is costly. Vector analysis theory and tools are to multiplication as multiplication is to addition. We propose that vector analysis is a more powerful, efficient, effective, and profitable way of doing work.9 We ask you to critically evaluate this proposal. It would be a generalization of the non-mathematical, non-scientific variety to claim that vector analysis is the solution to problems of this magnitude. Nevertheless, as senior executives, and as consultants to senior executives, we have seen the old process repeated month after month, year after year, decade after decade.

As you and your colleagues work to master evidence-based decisions, do not be surprised if you observe anger, denial, bargaining, and depression. This cycle accompanies any and every substantive life change. Though these ideas do not intimidate children, they can threaten adults. Anticipate this roller coaster. At the end, we hope you will arrive at acceptance. Negotiate and get to "Yes" with your peers. Get to yes with your executives and those you lead, so that your company is not using brown paper bags to compete against a more powerful, efficient, effective, and profitable way of doing work: a vector analysis applied to a data matrix.

The Dollar Value of Evidence

The quality of a manager's decisions and consequent actions determines profit and loss. Usually, managers weigh their "evidence" in monthly spreadsheet reports. They pore over the numbers. They watch production. Arithmetic totals, differences, averages, bar graphs and pie charts are in fact important parts of business decision solutions. They are valuable time savers. Power and beauty are their strengths, but also their Achilles heel: the most valuable information a manager has remains buried in the spreadsheets, further obscured by those same totals, differences, averages, bar graphs and pie charts.

You are probably reading this book because you have made business decisions. Some of those decisions were good; some were bad. Some were based on evidence; others were not. We ask you to contrast the profit related to good decisions with the loss related to bad ones. The difference between these two numbers forecasts the initial Return on Investment (ROI) you can expect from reading this book. It may help you to know that, in every case where our students have used the information we present, ROI is at least 10:1. A more typical result is 50:1. We know this from serving individual and corporate customers in virtually every industry, in Australia, New Zealand, England, Singapore, Mexico, Uruguay, Brazil, and 44 of the United States.

Whenever a manager looks at the numbers used to measure the performance of an organization, certain questions ought to begin to perk:10

1. Should I believe these numbers?
2. What is the evidence in these numbers, and how strong is it?
3. What actions should I take based on this evidence?
4. What evidence will confirm that management actions produced the desired results?

Because we are only human, these questions are accompanied by unsettling feelings and thoughts that nurture anxiety:

a) I am comfortable with the way things are.
b) This new knowledge puts my previous decisions in a bad light.
c) I don't want to lose my job.

With the pressure to produce profits, it is easy to understand why many managers resort to an expedient device: appearances. The privileges of position, such as title, office location and furnishings, clothing, automobiles, social networks, and financial reward, can and often do persuade others that appearance is evidence. Evidence-based decisions call for a higher standard. We must handle fear. In most cases, evidence and ROI may not be enough at first. Nevertheless, bottom line business results like these eventually break through the barriers of fear.

yet it is profound. the data matrix and vector analysis. rapid. Since 1986. conceived Six Sigma in 1986. Daniel Sloan and Russell A. The one. an engineer at Motorola. The demand for additional. We. Two fundamental Six Sigma concepts. Six Sigma has been based on two seemingly reasonable assumptions: © M. Six Sigma In today’s popular press. we all have a natural aversion to change. The greater the change is. Six Sigma made Profit Signals possible.11 The process of making an evidence-based decision is elegantly simple. two-. It is not new. and other experienced professionals in the field. competition are forcing all of us to improve. profitable solutions to even the most complex. more dramatic breakthroughs. have never been explained to anyone’s satisfaction in any previous publication. Experience. The iterative nature of the Six Sigma project cycle has taught us which parts of Six Sigma are essential.12 Bill Smith. Boyles. We return the favor by showing how to flex the evidence muscle without carrying the weight of bureaucracy. the greater our aversion.and n-dimensional profit signals waiting to be discovered in your raw data can provide practical. confounding and challenging business problems you face. Nevertheless. evidence and most of all. Six Sigma companies have made a conscious decision to conquer their reluctance. All Rights Reserved. It has demonstrated its ability to improve productivity and profitability in every industry. three. also have learned which parts are extraneous. evidence-based decisions are known as Six Sigma (6σ). This major step forward has produced trillions of dollars in profit. This is natural and good. Each breakthrough spurs demand for further. dramatic breakthroughs can be satisfied only if we trim fat from Six Sigma’s middle-aged spread. 2003 .18 Premise Our clients welcome the opportunity to improve on their current methods of analysis and decision-making.

It is impossible to teach Six Sigma theory to everyone in a company. Anyone can master what is called the Black Belt Body of Knowledge (BOK). The underlying principles of a data matrix and vector analysis are timeless style. Six Sigma is a very classy way to earn them. this process can be accomplished in 10 to 16 days. Today’s requirements for Six Sigma leadership are simply these: a) A passionate aptitude for pursuing the truth in a system. Boyles. The supporting evidence for this claim is overwhelming. It is well beyond any shadow of doubt. your enterprise must embrace and leverage the power of evidence-based decisions. Goods and services must be able to withstand the scrutiny of the free press. If your enterprise is to succeed. c) The ability to operate carefully chosen statistical software. Based on our experience. Daniel Sloan and Russell A.Premise 19 1. its products and services must exceed the great expectations of fickle customers. b) An understanding of the nature of a physical law or Generalization. and even an investigative Senate sub-committee. 2003 .000 years. Improved decisions can lead you and your company to Six Sigma profits. 2. Profits are always in fashion. For over 2. Six Sigma tools are too difficult for most people to use. the New Management Equation ( c2 = a2 + b2) has helped people make money from measurements. Anyone and everyone can learn this unifying theory. This is the path we take and the case we make in Profit Signals. You will immediately be able to use what you learn to make evidencebased decisions. If you expect your business to meet these objectives. and with the support of senior management leadership. You will learn fundamentals quickly. They can learn it quickly. Personal computers and software have changed the world. All Rights Reserved. © M. We have discovered these assumptions are no longer valid.

Pick a profitable 21st Century product, service, or sport. Any one will do. The qualities of almonds, gourmet ice cream, beer, fast food, movies, music, roller ball pens, Olympic gold medal speed skating blades, skiing, scuba diving, surgery, health care, pharmaceuticals, electrocardiograms, agriculture, aviation, computers, electricity, global navigation systems, magnetic information media, windows, textiles, telecommunication, oil, and X-treme competition all share a common bond. Breakthroughs in every one are driven by disciplined observation, measurement, the recording of data in an orderly data matrix fashion, and analysis.13 We welcome you to the world of the data matrix, vector analysis, standards of evidence, and the New Management Equation. Welcome to the universe of Profit Signals.

How to Read This Book

You can speed read this book in about a week. To get the “big picture” quickly, skim the illustrations. Read the captions to these exhibits. “Closing arguments” at the end of each chapter summarize the key content. After this initial overview, you may want to read it again at a more leisurely pace. If possible, complete the suggested experiments as you go. Feel free to collaborate on these with colleagues, friends, family members, neighbors, even your old high school teachers. The ideas, analogies and activities are presented in a particular sequence for good reason. So it is best to read the book front to back.

Chapter 1: The Five-Minute PhD – The opening chapter lets you earn your PhD in evidence-based decisions. It takes only five minutes. Your Five-Minute PhD grants you the power of vector analysis. Call it vector power if you will. Once you get a handle on profit signals, you will be able to systematically quantify and prioritize the effects of multiple factors on any manufacturing, service or financial process. The cornerstone of evidence has stood the test of time. Its principles run deep and far beyond rote, routine, mainstream business thinking.

Chapter 2: Standards of Evidence – We review the distinction between story telling and evidence. You will learn the difference between vector analysis applied to a data matrix and spreadsheet calculations applied to arbitrary clusters of numbers. In this chapter we trace the history of the cost-accounting variance analysis and show how to improve it with vector analysis. This chapter’s inside joke and secret handshake are that a Greek named Pythagoras invented the “New Management Equation” 2500 years ago. (The New Management Equation is easier to say and spell than Pythagorean Theorem. It is also sweeter to the ear.) In the early 1920s a genius named Ronald Fisher discovered how to apply the New Management Equation to identify profit signals in raw data and quantify strength of evidence. Fisher’s method is the international standard for quantitative analysis in all professions save two: accounting and business management. Dr. George E. P. Box, a Fellow of the Royal Society and elected member of the Academy of Arts and Sciences, endorsed one of our three-dimensional analysis books in 1997.14

Chapter 3: Evidence-based Six Sigma – If you are new to Six Sigma, this chapter has all the basics you need to know. It reviews the traditional Six Sigma tool set. We review the traditional Six Sigma breakthrough project cycle: Define, Measure, Analyze, Improve, Control (DMAIC). We cover organizational guidelines, project selection criteria, process maps and financial model graphs.

Chapter 4: Case Studies – We each have more than 20 years of consulting experience in the field of evidence-based decisions. In this chapter, we tell a few of our favorite breakthrough project stories. Our results have been published in peer-reviewed textbooks.15 CEOs, middle managers and line workers have signed affidavits and testified to the value of our work. These include:

• Improving the quality of state government customer services with a $525,000 pay off.

• Reducing the days in accounts receivable by 30 days with a 14-day project and a $425,000 bottom line impact.

• Tool grinding breakthroughs worth $900,000. Bottom line value for this company for each of the next three years is $1 million, equaling a grand total of $3 million.

• Doubling the productivity in a vinyl extrusion process while reducing the product material costs by 50% in three months time.

• Dramatically improving the patient outcomes in cardiovascular surgery while putting $1 million in additional profits on a hospital’s bottom line.

• Improving the operations of a hospital’s Emergency Department (ED) with a gross margin of $18 million for a 38.2 percent gain over the prior year’s performance.

Chapter 5: Using Profit Signals – This chapter presents the fundamentals of vector analysis with a few pages of reading and a physical model. You will learn how to expose the profit signals in your own data and represent them with bamboo skewers and Sculpey Clay®. You will tackle the following challenges facing the Corrugated Copter Company:

• Establishing baseline performance metrics.
• Comparing two ways of doing things.
• Comparing three ways of doing things.
• Comparing 256 different ways of doing things.

Is this something new and different? No. It is just another way to use the New Management Equation. It is vector analysis applied to a data matrix.

Chapter 6: Predicting Profits – Corrugated Copter managers want to be able to accurately predict flight times, profits, costs, inventory, and other important things. If they could improve the quality of their predictions, they could confidently take better advantage of market dynamics. Fortunately, this management team is up to speed on regression analysis.

Are these new and different? No. They are just other ways of using the New Management Equation. They are vector analysis applied to a data matrix.

Chapter 7: Sustaining Results – At Corrugated Copter, best business results always means earning the greatest revenue with the least expense. This is more than a politically correct platitude. It is responsible stewardship. The team learns to monitor and perfect their production processes through process capability studies and process control charts.

Chapter 8: The Three Rs – In its time, the Six Sigma business initiative created new breakthroughs in quality, productivity and profitability. Corrugated Copters now believes traditional Six Sigma organizational ideas are outdated. A team of Corrugated Copters leaders has proposed an education system that would render their Six Sigma bureaucracy obsolete.

Appendices – Here you will find a glossary of Profit Signal terms that will help you learn the language of evidence-based decisions. There is a complete bibliography of the essential evidence-based decision bookshelf. We trust this information will serve as your outline for future study. We have also included a Profit Signals Black Belt Curriculum and production information on the Six Sigma tools we used to write and produce this book.

Endnotes

1 Box, Joan Fisher. R. A. Fisher: The Life of a Scientist. John Wiley & Sons, New York, 1978.

2 Harrison, G. Charter. “Cost Accounting to Aid Production – I.” Industrial Management, The Engineering Magazine, Volume LVI, October 1918.

3 Harrison, G. Charter. “Cost Accounting to Aid Production – II.” Industrial Management, The Engineering Magazine, Volume LVI, No. 5, November 1918.

4 Harrison, G. Charter. “Standards and Standard Costs.” Industrial Management, The Engineering Magazine, Volume LVI, December 1918.

5 Johnson, H. Thomas, and Kaplan, Robert S. Relevance Lost: The Rise and Fall of Management Accounting. Harvard Business School Press, Boston, 1987.

6 Fisher, Roger, and Ury, William. Getting to Yes: Negotiating Agreement Without Giving In. Penguin Books, New York, 1981.

7 Anthony, Robert N., and Reece, James S. Accounting: Text and Cases, Eighth Edition. Irwin, Homewood, 1989, page 431.

8 Garrison, Ray H., and Noreen, Eric W. Managerial Accounting, 10th Edition. McGraw-Hill Irwin, New York, 2003, page 941.

9 Box, Joan Fisher. R. A. Fisher: The Life of a Scientist. John Wiley & Sons, New York, 1978.

10 Royall, Richard. Statistical Evidence: A Likelihood Paradigm. Chapman & Hall, New York, 1997.

11 Shewhart, Walter A. “Nature and Origin of Standards of Quality.” The Bell System Technical Journal, Volume xxxvii, number 1, January 1958, page 8.

12 Six Sigma is a registered trademark and service mark of Motorola Incorporated. The Motorola web site is a recommended resource for researching this history of Six Sigma. For a summary overview please read: Barney, Matt. “Motorola’s Second Generation.” Six Sigma Forum Magazine, 2002, pages 13-16. http://mu.motorola.com/pdfs/Mot_Six_Sigma.pdf, May 23.

13 Fuller, Buckminster. Critical Path. St. Martin’s Press, New York, 1981.

14 Sloan, M. Daniel. Using Designed Experiments to Shrink Health Care Costs. ASQ Quality Press, Milwaukee, 1995.

15 Sloan, M. Daniel, and Torpey, Jodi B. Success Stories on Lowering Health Care Costs by Improving Health Care Quality. ASQ Quality Press, Milwaukee, 1997.


Chapter 1

The Five-Minute PhD

PhD’s, engineers, scientists, mathematicians, statisticians, economists, medical doctors, managers, and executives don’t own the lock and key to data analysis. You don’t need a certificate on your wall to analyze data. Anyone can learn to do a vector analysis. The Five-Minute PhD is a democratic degree that exemplifies our age. It can be earned by anyone who is willing to work at it. What was once the high water mark of postgraduate study is now as simple as a Google web search.

Knowledge and its application are the taproots for professional stature and income. So long as information and knowledge remain shrouded by jargon, professions and authority remain secure. Few are eager to debunk the presumption of specialized knowledge that justifies position and paycheck.

Knowledge and information challenge authority. They can be disrespectful of bureaucracy and hierarchy. They applaud the pointed question. They reward the cross-examination of high priests and presidents. This is fun. With knowledge and information, people can and do effectively solve more of their own problems. Solving one’s own problems saves time and money. Companies that know how to solve problems quickly make more money than those that don’t.

We all acquire memories through the experience of our everyday lives. Memories make life rich and rewarding. Memories can teach, but they rarely bring innovation to our

work places. Companies that apply knowledge and intelligence make more money than those that depend on memories.

Start Your Stopwatch Now

Raw data contain information. Good information leads to reliable predictions. In our digital world all information can be, and is, turned into numbers. Telephones work, airplanes fly, music is played, products are manufactured, medical treatments are rendered and services are delivered, all through the use of numbers.

With the sole exception of pure mathematics, we get no new knowledge from casual experience. Except for an occasional stroke of dumb luck, we obtain new knowledge only by applying the basic disciplines of experimentation, observation, and analysis. Experimental data, measurements, come from disciplined observation by design. You will now learn how these three basic disciplines work, and prove that they do work, in less than five minutes. Neither previous experience nor training nor calculations are required.

When time and money are valued, experiments must be sized with economy in mind. Table 1 arrays the eight observations in an economical experiment. It contains all eight possible combinations of a three-factor experiment with two levels for each factor. This is called a 2³ (two raised to the third power) experimental design. “Two raised to the third power” is a mouthful, so it is usually pronounced “two cubed.” Think of tea with two cubes of ice rather than an equation, and the idea will be more refreshing.

The factors1 in this three-dimensional experiment are gender (x column), backpack weight bearing (y column), and activity (z column). Heartbeats is the measured response (dependent variable) in the experiment. For convenience, each factor was set at only two levels. The low setting is coded -1. The high setting is coded +1. The first column in the array, labeled “Experiments Called Runs,” establishes the order in which the eight observations were made.

Table 1 The cube or “design of experiments” (DOE) array is an ideal data matrix.

For each of the eight experiments, or runs, the resulting number of radial artery heartbeats was measured and recorded. (You can feel your own radial artery pulse by touching the inner aspect of one of your wrists with the index and middle fingers on the opposite hand.) These measurements are arrayed in the far right column of this data matrix. As you can see in Figure 1, this array does in fact create a cube.

Figure 1 The ideal data matrix forms a cube. Come back to this illustration after you complete your PhD.

The clock is ticking. Each of the eight response measures is the output from a unique combination of the three factors. For example, the measurement 70 for Run #1 was for a sitting man who had no weight in his backpack. This is an example of “disciplined observation.”

Please turn your attention to the pattern of the response heartbeat measurements in the far right column. Take time to look at the evidence patterns in the data matrix. Consider the following questions:

• Which combination of variables produced the two highest heartbeat rates?
• Which combination produced the two lowest heartbeat rates?
• Which variable appears to have the least effect on heartbeats?
• How would you predict future outcomes from similar experiments?

Please pause now and stop reading. When you have answered all four questions in the list, come back and continue.

Are you finished? Check your watch. If you noticed that carrying a 50-pound weight in a backpack increases the number of heartbeats for aerobic activity, but not for sitting, you have earned your Five-Minute PhD! If you concluded that aerobic exercise has the strongest effect on heartbeats, you graduated Cum Laude. If, in addition, you concluded that gender doesn’t really make much of a difference when it comes to the number of heartbeats, you are at the top of your class. You were able to correctly identify one main effect (activity), one inert factor (gender) and a two-factor interactive effect (the combination of activity and backpack weight).

Your eyeball vector analysis of eight numbers was accurate. We predicted that you could successfully complete a three-dimensional, doctoral-level vector analysis in less than five minutes. We bet we were right.
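The eyeball analysis can also be written out as plain arithmetic. Below is a minimal Python sketch of our own (it is not from the book). Only Run #1, a sitting man with no backpack weight measuring 70 heartbeats, is pinned down by the text; the assignment of the other seven measurements to runs is an illustrative assumption.

```python
# Sketch of the Five-Minute PhD 2^3 experiment. The run-to-value assignment
# is assumed for illustration; the book confirms only Run #1 = 70.
from itertools import product

factors = ("gender", "weight", "activity")           # x, y, z columns
heartbeats = [70, 140, 88, 190, 86, 136, 68, 180]    # assumed run order

# All eight -1/+1 combinations of the three factors: the 2^3 design.
runs = [dict(zip(factors, combo)) for combo in product((-1, +1), repeat=3)]

def main_effect(name):
    """(average response at +1) minus (average response at -1)."""
    hi = [hb for run, hb in zip(runs, heartbeats) if run[name] == +1]
    lo = [hb for run, hb in zip(runs, heartbeats) if run[name] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

for name in factors:
    # activity dominates (+83.5); gender is nearly inert (-4.5)
    print(f"{name:>8} effect: {main_effect(name):+6.1f}")

# Answers to the first two eyeball questions: rank the eight corners.
ranked = sorted(zip(heartbeats, (tuple(r.values()) for r in runs)))
print("two lowest :", ranked[:2])
print("two highest:", ranked[-2:])   # both aerobic with the 50-pound weight
```

Any spreadsheet would do the same sums; the point is that the whole analysis is a handful of group averages.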

Business Art and Science

By using a special kind of row and column array called a data matrix, you can simultaneously quantify and prioritize the effects of several process variables with relatively small amounts of data. This applies to any manufacturing, business, service or health care process. Consider the economies of using this technique to solve business problems. Imagine the possibilities.

Nowadays, inexpensive software effortlessly applies vector analysis to any data matrix. Software automatically calculates the profit signals. It ranks them by importance and determines the strength of evidence. Then, with the grace of a high technology thrill ride, software applications create three-dimensional graphs annotated with accurate predictions. So, fasten your seat belts. Please keep your hands and arms inside the analysis rocket.

The eight numbers in the far right hand column of the 2³ data matrix in Table 1 actually form a single entity. This entity is an eight-dimensional vector! You have now entered hyperspace. Science-fiction writers use the mathematical term hyperspace when they need a word to describe faster-than-light travel. Hyperspace is actually the mathematical term for a space with four or more dimensions. Yes, it is true. You succeeded in analyzing the three-dimensional experiment in Table 1 because you were able to visually compare the eight-dimensional vector for heartbeats to the eight-dimensional vectors for activity, gender, and backpack weight. These vectors are the basis for profit signals. We explain the details in Chapter 5.

For most of us, visualizing more than three dimensions is out of the question. The hallmark of Ronald Fisher’s genius was his ability to visualize n dimensions. This is the vast Generalization of the mathematical/scientific variety we mentioned. The evidence we now have about our universe confirms that Fisher’s vision of n-dimensional hyperspace was correct.2 This was Imagineering at its very finest. Hyperspace is not as easy to accept as a free ride on Disney’s Space Mountain. Nevertheless, Fisher’s vector analysis is the elegant simplicity that underlies myriad, seemingly unrelated analysis techniques.

For example, William Gosset, the student of Fisher who first conceived the theory of statistical inference for small samples in 1907, used these statistics at the Guinness Brewery. Those who enjoy a stout beer now and again have been thankful ever since. Ronald Fisher conducted the first cubic and higher-dimensional experiments in 1919. Working at the English Rothamsted Experimental Station, he applied these principles to solve difficult, important problems using small, economical sets of data.

You intuitively used data matrix and vector analysis principles to interpret the data in the Five-Minute PhD experiment. You will discover this for yourself as you complete the exercises in this book. New vector analysis users are often amazed at the accuracy of predictions based on cubic and higher-dimensional experiments. Repeated experiments at the setting of (+1, +1, +1) or (Female, 50-pound weight, Aerobics) will produce an average heartbeat of about 188.25. This predicted value labels the upper, right, back corner of the cube in Figure 2. Repeated experiments with the setting of (-1, -1, -1) or (Male, No Weight, Sitting) will produce an average heartbeat of about 71.75. This predicted value labels the lower, left, front corner of the cube in Figure 2.

Figure 2 Vector analysis applied to a data matrix gives analysts the power of three-dimensional graphics.

The differences in appearance between this illustration and richer ones elsewhere in our book can be explained: the superior tables and illustrations were created using vectors.

Eighty years of revolutionary advances in agriculture, biotechnology, medicine, manufacturing, communications, computing, finance, information technology, space technology, and transportation support Fisher’s mathematical/scientific Generalization. Vector analysis applies to everything. It is practical. It is proven. It is used to describe Einstein’s special and general theories of relativity. Vector analysis is used to coordinate all commercial jet landings at Orlando International.3 It ought to be used to create financial statements.

It was used in 1908 by Willem Einthoven to create the electrocardiogram (EKG). It is used to graph voltage variations resulting from the depolarization and repolarization of the cardiac muscle. This physiological phenomenon is called a spatial vectorcardiographic loop.4 Frank Netter, the Norman Rockwell of medical illustration, drew a beautiful picture of a vector analysis (Figure 3).5

Figure 3 The 2³ cube you used to analyze heartbeats is identical to EKG theory and Rh+/Rh- blood groups. The plus (+) and minus (–) signs for Rhesus blood groups symbolize vector analysis reference points.

In Netter’s drawing, the x-, y-, and z-axes are labeled using medical terminology. The x-axis refers to the sagittal, or side planes, of the body. The lower, horizontal y-axis is illustrated while the upper y-axis plane is implied. The back “frontal” plane is the illustrated z-axis plane.

The EKG made it possible to observe, measure, and graph the heart’s electrical impulses over time. The patterns that emerge from the EKG vector analysis are critical to the prediction of a beating heart’s behavior. Knowledge produced by this Nobel Prize-winning achievement led to the creation of the most profitable niche in American medicine: cardiovascular care. Revisit this illustration when you read the Six Sigma case study on “beating heart” Coronary Artery Bypass Graft (CABG) surgeries in Chapter 4.

Consider the vast Generality, and the enormous profit potential, of this single tool. Table 2 lists a few of the thousands of proven, profitable applications of vector analysis to a 2³ data matrix. The only limitation is imagination. So imagine. Take time to write down factors (inputs) and responses (outputs) that could help you make more money.

Table 2 The cube experiment works for any process in any system.

Once you have performed the disciplined observations and recorded the measurements demanded by the cube’s data matrix, the eight-dimensional vectors, especially the all-important profit signals, will lead you directly to the most profitable solution.

Profit, loss, productivity, time, inventory turns, taxes, and sales volume (any response you can measure) depend on many factors and the interaction of those factors in your business. Some of these are under your control, some are not. Complexity is the rule, not the exception.

The only way to distinguish profit signals from noise is to apply vector analysis to a data matrix. The ratio of the length of the profit signal vector to the length of the noise vector quantifies the strength of evidence in the data. A “long, strong” profit signal vector and a “short, weak” noise vector indicate large, statistically significant effects and reliable predictions (Figure 5). A “short, weak” profit signal vector and a “long, strong” noise vector indicate small, statistically insignificant effects and unreliable predictions (Figure 6).

Figure 5 A long, strong Profit Signal with a short, weak Noise vector means there is a statistically significant effect.

Figure 6 A long, strong Noise vector and a short, weak Profit Signal indicate no statistically significant effect. The variation is most likely due to Chance.
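One way to make the “long/strong” versus “short/weak” language concrete is to compute the two vector lengths for a single factor. The sketch below is our own illustration, not the book’s Chapter 5 method, and it reuses the assumed run-to-value assignment from the earlier sketch: the signal vector holds each run’s group mean minus the grand mean, and the noise vector holds what is left over inside the groups.

```python
# Profit-signal length vs. noise length for the activity factor.
# Heartbeat values and their run order are assumed, as before.
from math import sqrt

heartbeats = [70, 140, 88, 190, 86, 136, 68, 180]   # assumed run order
activity   = [-1, +1, -1, +1, -1, +1, -1, +1]        # sitting vs. aerobics

grand_mean = sum(heartbeats) / 8                     # 119.75
group_mean = {lvl: sum(h for h, a in zip(heartbeats, activity) if a == lvl) / 4
              for lvl in (-1, +1)}                   # {-1: 78.0, +1: 161.5}

signal = [group_mean[a] - grand_mean for a in activity]
noise  = [h - group_mean[a] for h, a in zip(heartbeats, activity)]

def length(v):
    """Euclidean length of a vector: the New Management Equation at work."""
    return sqrt(sum(x * x for x in v))

print(f"profit signal length: {length(signal):.1f}")   # long and strong
print(f"noise length:         {length(noise):.1f}")    # shorter and weaker
print(f"signal-to-noise:      {length(signal) / length(noise):.2f}")
```

Here the signal vector is more than twice as long as the noise vector, the “long/strong” case of Figure 5.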

Data in the matrix must be obtained through a process of disciplined observation. Trial and error is expensive, crude, time-consuming, and ineffective. It is not a viable business strategy for the 21st Century.

The cube is three-dimensional; it has three factors. It has served as a keystone of professional knowledge and profitability since 1630 when Rene Descartes introduced the method for three-dimensional thinking.6,7 Despite the fact that we inhabit a world of three physical dimensions, there is no reason to limit ourselves to three-dimensional experiments. To illustrate, Table 3 lists a few n-dimensional experiments with n ranging from 2 to 6. Disciplined, multi-dimensional observations and vector analysis reduce the financial risk associated with every important business decision.

Table 3 Multi-dimensional experiments improve profits in every industry.

Business leaders must not excuse themselves from mastering this knowledge or the skills to go with it. To do so is to gamble the future of their companies on needlessly risky decisions. Senior managers and corporate directors are knowledge workers. They, more than any other members of an organization, need to know how these tools work. Typically, they can acquire this knowledge in just four days of accelerated, hands-on training.

Profit Signals

Only a few years before Fisher used the cube and higher-dimensional experiments to dramatically increase profitable crop yields in England, Pablo Picasso and George Braque created a new art form called Analytic Cubism. “Every aspect of the whole subject is seen in a single dimension.”8 In Picasso’s original Analytic Cubism, “objects were deconstructed into their components…. it was used more as a method of visually laying out the FACTS…”9 Picasso and Braque aimed at presenting data as perceived by the mind rather than the eye. Fisher explained his model and methods using virtually identical words. The analogies between Picasso’s and Fisher’s cubes are intriguing. We will discuss this further in Chapter 2.

Fisher referred to vector analysis as the Analysis of Variance. In Fisher’s terminology, Variance is a statistical measure based on the squared length of the variation vector. In cost accounting, a variance is a difference between an actual value and a standard or budgeted value. Fisher’s definition pre-dates the accounting definition by forty-some years. Statisticians, college professors, high school teachers, Six Sigma Black Belts, spreadsheets and statistical programs often employ the hideous acronym ANOVA for Analysis of Variance. We are stuck with it, so we use it. An ANOVA

“deconstructs” a data vector into the basic pieces essential for evidence-based decisions:

Raw Data = Data Average + Profit Signal(s) + Noise

We will discuss and illustrate various aspects of this vector equation in subsequent chapters. For now, we focus on the component of greatest immediate interest, the Profit Signal.

Table 4 contains a coded version of the Five-Minute PhD cube experiment. The numbers –1 and +1 are traditionally used to designate low and high levels of each factor. Actual factor names are represented as generic X, Y and Z variables. Actual heartbeat data are used in the response column. As a Generalization, these could be any measurements of interest to you.

Table 4 The coded version of the Five-Minute PhD cube experiment.

As we saw in Figure 1, the three factors in a cube experiment form the edges of a three-dimensional solid, a cube. In this sense, a cube experiment is three-dimensional. However, the data matrix for a cube experiment has eight combinations. The response column contains eight measurements, one for each combination. These factor combinations and measurements correspond to the eight corners of the cube. In other words, here comes hyperspace again.
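The deconstruction can be sketched numerically. In the code below (our illustration, with the run-to-value assignment assumed as in the earlier sketches), each piece is the projection of the raw data vector onto one of the ±1 contrast columns. The pieces add back to the raw data exactly, and their squared lengths add up as well, which is the New Management Equation, c² = a² + b², working in eight dimensions.

```python
# Raw Data = Data Average + Profit Signal(s) + Noise, as a vector identity.
# Run order and heartbeat assignment are assumed for illustration.
y = [70, 140, 88, 190, 86, 136, 68, 180]             # assumed run order
X = [-1, -1, -1, -1, +1, +1, +1, +1]                  # gender (slowest)
Y = [-1, -1, +1, +1, -1, -1, +1, +1]                  # backpack weight
Z = [-1, +1, -1, +1, -1, +1, -1, +1]                  # activity (fastest)

def project(contrast):
    """Project y onto a column of +/-1s (or the all-ones column)."""
    coef = sum(a * b for a, b in zip(y, contrast)) / 8
    return [coef * c for c in contrast]

average = project([1] * 8)                            # the Data Average vector
signals = {name: project(col) for name, col in {
    "X": X, "Y": Y, "Z": Z,
    "XY": [a * b for a, b in zip(X, Y)],
    "XZ": [a * b for a, b in zip(X, Z)],
    "YZ": [a * b for a, b in zip(Y, Z)],
}.items()}
noise = [v - a - sum(s[i] for s in signals.values())
         for i, (v, a) in enumerate(zip(y, average))]

ss = lambda v: sum(x * x for x in v)                  # squared vector length
parts = [average, *signals.values(), noise]

# The pieces rebuild the raw data, and their squared lengths sum to the
# squared length of the raw data (the components are orthogonal).
assert all(abs(sum(p[i] for p in parts) - y[i]) < 1e-9 for i in range(8))
assert abs(sum(ss(p) for p in parts) - ss(y)) < 1e-9
```

The assertions are the whole point: nothing is lost in the deconstruction, and the “variance” bookkeeping follows from the geometry alone.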

The corners of the cube give a three-dimensional representation of an eight-dimensional data vector. Acquiring the knowledge and mastering hyperspace analysis skills is well within everyone’s intellectual reach. Accepting this beyond-belief reality is easier said than done. Hyperspace, the real one rather than the realm of Luke Skywalker, is a very big idea. Consider what Fisher’s ingenious, genius-level model suggests.10 It is beautiful art, and art is the dream of a life of knowledge.

We can use the cube to create three-dimensional representations of the eight-dimensional profit signal vectors in a cube experiment. For example, Figure 7 shows shaded, opposing planes corresponding to the –1 and +1 levels of the most important factor Z, Activity. These two opposing planes represent the eight-dimensional profit signal vector for the overall effect of factor Z. This main effect is defined as the difference in the average response values for Z = -1 (Sitting) and Z = +1 (Aerobics):

Overall Z effect = (Average of 140, 136, 180, 190) minus (Average of 70, 68, 86, 88) = 161.5 - 78.0 = 83.5

Figure 7 The opposing planes represent the eight-dimensional profit signal vector for the overall effect of factor Z, Activity. This is a case of a “long/strong” Profit Signal and “weak/short” Noise.

These planes correspond to the grouping of the eight response measurements shown in Table 5.

Table 5 displays in a spreadsheet format the two groups of measurements from the original Five-Minute PhD experiment corresponding to the opposing planes in Figure 7.

Table 5 This grouping of the eight response measurements corresponds to the opposing planes in Figure 7.

If Z (activity) had no effect, the average response on the front plane of the cube would roughly equal the average response on the back plane. This is not the case. The average of the back plane is 83.5 heartbeats larger than the average of the front plane. In the Five-Minute PhD experiment, Z had a large effect.

In actual practice, we must also quantify the strength of evidence for each profit signal. We will discuss this, and the related concept of statistical significance, in Chapter 2. In the Five-Minute PhD experiment, activity and backpack weight had noticeable effects on heartbeats. We reached these conclusions by comparing the column of response measurements in the data matrix to the columns representing the factors.

Figure 8 Opposing planes representing the eight-dimensional profit signal vector for the main effect of factor Y. This is a case of a “long/strong” Profit Signal and “weak/short” Noise.

Figures 8 and 9 show the shaded, opposing planes representing the profit signal vectors for the main effects of factors Y and X.

Figure 9 Opposing planes representing the eight-dimensional profit signal vector for the main effect of factor X.

Pairs of planes on the cube can also represent interactive effects, but they are perpendicular rather than parallel. When the effect of one factor depends on the level of another, they are said to have an interactive effect. For example, in the Five-Minute PhD experiment, activity and backpack weight had an interactive effect. Increasing the weight affected the number of aerobic activity heartbeats, but not the number of sitting heartbeats.

For example, one of the perpendicular planes in Figure 10 contains the four corners where X and Z have opposite signs (X × Z = -1). The other plane contains the four corners where X and Z have the same sign (X × Z = +1). These planes represent the eight-dimensional profit-signal vector for the interactive effect of X and Z, defined as the difference in the average response values for X × Z = -1 and X × Z = +1.

If you think about Star Wars while you mull over these images, they will be more entertaining. Plus, believe it or not, people do get up and cheer at the end of a multimillion-dollar breakthrough Six Sigma project, just like they do when good guys finally win on the big screen.
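The interactive effect defined above reduces to one subtraction on a product column. A hedged Python sketch (heartbeat values and run order assumed, as in the earlier sketches):

```python
# Interactive effect = (average of y where the product of two factor
# columns is +1) minus (average where the product is -1).
y = [70, 140, 88, 190, 86, 136, 68, 180]              # assumed run order
cols = {
    "gender":   [-1, -1, -1, -1, +1, +1, +1, +1],
    "weight":   [-1, -1, +1, +1, -1, -1, +1, +1],
    "activity": [-1, +1, -1, +1, -1, +1, -1, +1],
}

def interaction(a, b):
    """Difference of averages over the +1 and -1 halves of cols[a]*cols[b]."""
    prod = [p * q for p, q in zip(cols[a], cols[b])]
    plus  = [v for v, s in zip(y, prod) if s == +1]
    minus = [v for v, s in zip(y, prod) if s == -1]
    return sum(plus) / len(plus) - sum(minus) / len(minus)

print(interaction("weight", "activity"))   # 23.5: weight matters when aerobic
print(interaction("gender", "activity"))   # -2.5: essentially no interaction
```

The large weight-by-activity value and the near-zero gender-by-activity value are the arithmetic behind the perpendicular-planes picture.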

Figure 10 Perpendicular planes representing the eight-dimensional profit signal vector for the interactive effect of factors X and Z.

If X and Z had no interactive effect, the average response on each plane would be the same. If X and Z had a large interactive effect, one plane would have a much larger average than the other. Figure 11 shows the shaded, perpendicular planes representing the profit signal vector for the interactive effect of factors X and Y. Figure 12 shows the shaded, perpendicular planes representing the profit signal vector for the interactive effect of factors Y and Z.

Figure 11 Perpendicular planes representing the eight-dimensional profit signal vector for the interactive effect of factors X and Y.
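The interactive effect can be computed the same way as a main effect, grouping the eight corners by the sign of the product X × Z instead of by a single factor. The responses below are hypothetical stand-ins, not the book's data; the function follows the definition given in the text (the difference in the average response values for X × Z = -1 and X × Z = +1).

```python
# Hypothetical 2^3 experiment: (x, y, z, response), factors coded -1/+1.
runs = [
    (-1, -1, -1, 65), (+1, -1, -1, 68), (-1, +1, -1, 72), (+1, +1, -1, 70),
    (-1, -1, +1, 140), (+1, -1, +1, 185), (-1, +1, +1, 138), (+1, +1, +1, 182),
]

def interaction_effect(runs, i, j):
    """Average response where the factor product is +1,
    minus the average where the product is -1."""
    same = [r[3] for r in runs if r[i] * r[j] == +1]
    opposite = [r[3] for r in runs if r[i] * r[j] == -1]
    return sum(same) / len(same) - sum(opposite) / len(opposite)

xz = interaction_effect(runs, 0, 2)  # interactive effect of X and Z
```

Each "plane" here is just the set of four corners with a common sign of X × Z; a large difference between the two plane averages is the geometric signature of an interaction.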

Figure 12 Perpendicular planes representing the eight-dimensional profit signal vector for the interactive effect of factors Y and Z.

Data Recycling

The three-dimensional cube diagrams provide a looking glass into eight-dimensional hyperspace. These familiar geometric models convey esthetic beauty and analytical power. This is not the 10th grade geometry taught by Mr. Greismeyer at Centerville High. This is the geometry of Michelangelo, da Vinci, Galileo, Alexander Calder, Guglielmo Marconi, Orville and Wilbur Wright, Einstein, and a Fellow of the Royal Society named Sir Ronald Fisher.

These diagrams allow our three-dimensional eyes to make some interesting observations. For example, have you noticed that all the corner values are used repeatedly? Every data point appears six different times, once in each of the six profit signal vectors shown above! This is a lot of work for only eight little numbers to do. This data-recycling phenomenon is a characteristic of all cubic and higher-dimensional experiments.

For most companies, profit signals mean they can eliminate at least 83% of the data collection and storage costs incurred with primitive trial and error methods. Eighty-three percent is not a typographical error. It is simply a fact. The larger the number of factors, the greater the savings. This bottom-line result is enhanced by the fact that vector analysis gives you the right answers to your most pressing business problems.
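The data-recycling claim is easy to verify in code: in a 2³ experiment every one of the eight corner responses carries a ±1 coefficient in each of the six contrast (profit signal) vectors, so each measurement is used six times. A sketch, using generic coded runs rather than the book's data:

```python
from itertools import product

# The eight corners of the cube, coded -1/+1.
corners = list(product([-1, +1], repeat=3))

# Six profit-signal contrasts: three main effects and three two-way interactions.
contrasts = {
    "X": lambda x, y, z: x,
    "Y": lambda x, y, z: y,
    "Z": lambda x, y, z: z,
    "XY": lambda x, y, z: x * y,
    "XZ": lambda x, y, z: x * z,
    "YZ": lambda x, y, z: y * z,
}

# Count how many contrast vectors give each corner a nonzero coefficient.
usage = {c: sum(1 for f in contrasts.values() if f(*c) != 0) for c in corners}
```

Every corner's count comes out to six: each of the eight little numbers is recycled into all six profit signal vectors.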

The Full Circle of Data Discovery

We are back to where we started in the Premise. Once data are entered into a data matrix, a computer automatically calculates the profit signals, determines the strength of evidence and graphs the predicted values. It even explains the results. The data matrix and the New Management Equation, c² = a² + b², form the backbone of vector analysis.

Since 1920, vector analysis has been the path to credible, quantitative evidence.12, 13, 14 It was the foundation of science in the 17th Century when Galileo proved that the earth revolved around the sun. It was the foundation of science at the turn of the 20th Century when Einstein created his special and general theories of relativity.15 It is the foundation of science at the turn of the 21st Century.

Since 1935 the application of vector analysis to data matrices, better known to college students as ANOVA, has produced huge financial returns in agriculture, engineering, manufacturing, health care and process industries.11, 16 In those days this tool set was called the Design of Experiments. In the 1980s it was repackaged as one of many Six Sigma tools. In 2003, it is simply Profit Signals that come from the New Management Equation. Vector analysis points to the entire family of common statistical distributions.

The New Management Equation

Use your own imagination to conduct experiments to verify the insights you gained from your Five-Minute PhD. We encourage you to use analogies to accelerate your learning. An analogy illustrates similarities between ideas or things that are often thought to be dissimilar. To quote one of our past students, “An analogy is like a comparison.” We have found that analogies, parables and old-fashioned story telling are the most effective tools for teaching people
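The New Management Equation c² = a² + b² is the Pythagorean theorem applied to data vectors: the squared length of a data vector equals the squared length of its vector of averages plus the squared length of its variation vector. A minimal sketch, using a made-up two-point data vector:

```python
data = [3.0, 4.0]                      # a two-dimensional data vector
mean = sum(data) / len(data)           # 3.5
averages = [mean] * len(data)          # the closest constant vector
variation = [d - mean for d in data]   # data minus averages

def squared_length(v):
    return sum(x * x for x in v)

c2 = squared_length(data)        # 25.0
a2 = squared_length(averages)    # 24.5
b2 = squared_length(variation)   # 0.5
```

The identity c² = a² + b² holds exactly because the variation vector is perpendicular to the constant vector of averages — the geometric fact that makes the Analysis of Variance work.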

the principles of evidence-based decisions. Yes, this is yet another paradox in evidence-based decisions. Like any true Generalization, the cube experiment is a Law of the Universe.

Have some fun practicing with your new PhD in universes of your own. Since you are now a PhD, feel free to throw around phrases like “Hegelian Dialectic” during your conversations. This crowd-pleaser will let other doctors of philosophy know that you know what you are talking about.

Table 6 lays out two proven favorites. After you have completed your experiments with family, friends, and colleagues at work, discuss the implications of these analogies. Specifically, how can you use what you now know about a data matrix and vector analysis to save time and/or make more money?

Table 6

Closing Arguments

The following testimonies were transcribed in various historical hearings and trials about the Five-Minute PhD.

Archimedes: “Eureka, I have found it, I have found it. I solved the mystery of specific gravity by comparing the weights of solids with the weights of equal quantities of water.”

“Euclid, you made the impossible possible by the simplest of methods. By using Euclid’s magical formula, c² = a² + b², one can transform the earth and the heavens into a vast design of intricate configurations. But please, isn’t there a shorter way of learning geometry than through your method?”17

Euclid: “Sire, in the country there are two kinds of roads—the hard road for the common people and the easy road for the royal family. But in geometry all must go the same way. There is no royal road for learning.”

Galileo Galilei: “Philosophy is written in this grand book the universe which stands continually open to our gaze. But this book cannot be understood unless one first learns to comprehend the language and read the letters in which it is composed. It is written in the language of mathematics, and its characters are triangles, circles, and other geometric figures without which it is humanly impossible to understand a single word of it.”

Albert Einstein: “The Gaussian coordinate system of Chance variation is a logical generalization of the X, Y, Z Cartesian coordinate system. I used the data matrix cube with a vector to suggest the passage of time in my 1916 best seller, Relativity, The Special and General Theory, A Simple Explanation that Anyone Can Understand. Now if I were to speculate a bit, and if there were a computing machine 2500 years from now that ran a vector analysis program with a data matrix, all might be able to travel an easier road.”18

James Turrell is a hyperspace sculptor of international reputation. Mr. Turrell uses light in his search for mankind’s place in the Universe.

James Turrell: “I use the same Cartesian coordinate system to pilot my aircraft. I use the X, Y, and Z axes of light to achieve my objectives. I want to create an atmosphere that can be consciously plumbed with seeing, like the wordless thought that comes from looking in a fire.”19

Walt Disney: “I only hope that we never lose sight of one thing—that it all started with a mouse. Born of necessity, the little fellow literally freed us of immediate worry. He provided the means for expanding our organization. He spelled production liberation for us. Disneyland will never be completed. It will continue to grow as long as there is imagination left in the world.”20

Endnotes

These factors are also commonly known as independent variables.

1 Box, Joan Fisher. R. A. Fisher, The Life of a Scientist. New York: John Wiley and Sons, 1978.

2 Netter, Frank. The CIBA Collection of Medical Illustration, Volume 5, Heart. Commissioned by CIBA, 1969.

3 Netter, Frank. The CIBA Collection of Medical Illustration, Volume 5, Heart. Commissioned by CIBA, 1969. Page 4.

4 Dubin, Dale. Rapid Interpretation of EKG’s, Edition V. Tampa: Cover Inc., 1996.

5 http://www.ibiblio.org/wm/paint/tl/20th/cubism.html

6 http://www-gap.dcs.st-and.ac.uk/~history/Mathematicians/Descartes.html

7 http://www.rothamsted.bbsrc.ac.uk/pie/sadie/reprints/perry_97b_greenwich.pdf

8 http://www.artchive.com/artchive/P/picasso_analyticalcubism.html

10 Inscription on the southern ceiling of the rotunda leading to a James Turrell Skyspace installation at the Henry Art Gallery on the University of Washington campus.

11 Fisher, Ronald A. The Design of Experiments. New York: Hafner Press, 1935.

12 Fisher, R. A. Statistical Methods for Research Workers, Thirteenth Edition. New York: Hafner Publishing Company Inc., 1967.

13 Fisher, R. A. “Frequency Distribution of the Values of the Correlation Coefficient in Samples from an Indefinitely Large Population.” Biometrika 10: 507-521, 1915.

14 Fisher, R. A. “On the Probable Error of a Coefficient of Correlation Deduced from a Small Sample.” Metron 1: 3-32, 1921.

15 Einstein, Albert. Relativity, The Special and General Theory, A Clear Explanation that Anyone Can Understand. New York: Crown Publishers, 1952. Pages 4-5.

16 Box, George E. P., Hunter, William G., and Hunter, J. Stuart. Statistics for Experimenters: An Introduction to Design, Data Analysis and Model Building. New York: John Wiley and Sons, 1978.

17 Thomas, Henry, and Thomas, Dana Lee. Living Biographies of Great Scientists. Garden City, New York: Garden City Books, 1941. Page 32.

18 Einstein, Albert. Relativity, The Special and General Theory, A Simple Explanation that Anyone Can Understand. New York: Crown Publishers, 1952. Page 90.

19 http://www.pbs.org/art21/artists/turrell/

20 http://goflorida.about.com/library/bls/bl_wdw_waltdisney_quotes.htm

Chapter 2

Standards of Evidence

What are the objective standards of evidence your business uses to make decisions? We ask all new clients this question. Too often the answer is an uncomfortable silence or, “We’ve never asked ourselves that question before.”

Evidence is the foundation for making better, more profitable business decisions. But evidence provides an operational basis for making decisions only if we have standards by which to judge the strength of evidence.

Demands for improved financial performance put old-school managers in a bind. For the first time in history, they are competing head-to-head with managers in developing nations like Brazil, Mexico, India, China, and Malaysia—take your pick. Labor is a tiny fraction of the total cost of doing business in these newly emerging competitive economies. As a result, North American businesses, large and small, must find ways to reduce production and delivery costs by at least 30 percent. Many must achieve this within the next five years or go out of business. If you doubt this possibility, visit Seattle, Washington. Since September 11, 2001 a good portion of our world-famous gridlock traffic jams vanished along with more than 30,000 jobs.

For many old-school managers, staying inside their corporate cultural comfort zone, with an atmospheric vacuum of evidence standards, is more important than achieving any business goal, including profitability. This is a counterproductive and, given the painfully apparent need for jobs, a socially irresponsible attitude.

Comfortable or not, spirited capitalism has put evidence-based decisions on the map. Whether they know it or not, vector analysis and standards of evidence are now on every manager’s radar screen. The only question is who will recognize and respond to the signals.

Poetry versus Science

Efforts to understand the world we live in began with story telling. Stories thrive in many forms, oral and written, poetry and prose. Stories convey values. They define and maintain cultures, including corporate cultures. Stories evoke fear, hope, joy, anger, sympathy, humility, respect, wonder and awe. Stories build like pearls around grains of historical fact. They tell us much, mostly about ourselves. Stories are not laws. They do not, and are not intended to, reliably describe historical facts or physical realities. Story telling does have its place, but it can be at odds with science. Story telling often involves tales of trial and error.

Scientific discoveries inspire as much wonder and awe as any Paul Bunyan tale. But the driving force behind science is disciplined observation, experimentation and analysis. The scientific method, which can be equated with Six Sigma, embraces affirmative skepticism. This skepticism is diametrically opposed to the innocence of credulity. Credulity, or as some prefer to say naïveté, is the suspension of critical thinking. Credulity allows us to experience the emotional impact of a good story. Credulity makes Disneyland, Disneyworld and the Epcot Center fun.

The tension between story telling and science dates to poet John Keats’ criticism of scientist Isaac Newton’s prism. Newton discovered that “white light” contains an invisible spectrum of colored light. He made this spectrum visible by shining ordinary light through a prism. The process Newton used is called refraction.
Refraction comes from a Latin word which means to break up.1 If you have ever had an eye exam for corrective lenses, your ophthalmologist or optometrist used refraction to determine your prescription.



Newton used his prism to create rainbows (Figure 1). Keats was appalled. Newton ruined rainbows by explaining them.2 Glasses, contact lenses, fiber-optic cables, lasers, big-screen TV and digital cameras work because Newton stuck to his intellectual guns. We are glad he did. The process of refracting white light into a visible spectrum of colors is a form of vector analysis. We are not being overly lyrical when we say that Ronald Fisher’s vector analysis “refracts” data. Refraction makes profit signals visible. This is essentially what you did to earn your Five-Minute PhD.

Figure 1 A diamond sparkles with colorful data vectors refracted from ordinary light.

For poets, this perspective is unwelcome. They are not alone in this feeling. Again and again, we hear Keats’ critique of Newton echoed in the protests of old-school managers who reject profit signals as well as the process of disciplined observation, experimentation and analysis.

“Scientific” Management

Managing any business is a challenge. Complexity arises from materials, work methods, machinery, products, communication systems, customer requirements, social interactions, cultures and languages. The first step in solving



complex business problems is to frame them in terms of a manageable number of key variables. Bottom-line profitability is the ultimate objective, but other metrics must also be considered. Sales, earnings per share, cost and time to develop and market new products, operating costs, inventory turnover, capital investments and days in accounts receivable are just a few. Profit signals from one or more of these variables often demand timely, reasoned responses.

Frederick W. Taylor mesmerized the business community of his day with the 1911 publication of The Principles of Scientific Management. Taylor aimed to explain how any business problem could be solved “scientifically.” As an engineer for a steel company, Taylor had conducted a 26-year sequence of “experiments” to determine the best way of performing each operation. He studied 12 factors, encompassing materials, tools and work sequence. He summarized this massive investigation with a series of multifactor predictive equations. This certainly sounds like science. Unfortunately, trying to solve complex business problems with Taylor’s methods is akin to surfing the Internet with a rotary phone.

In his 1954 classic How to Lie with Statistics, Darrell Huff characterized Taylor-style science as follows: “If you can’t prove what you want to prove, demonstrate something else and pretend that they are the same thing.”3 Taylor studied his 12 factors one at a time, holding the other 11 constant in each case.4 This invalidates his multifactor equations. One-factor-at-a-time experiments are so thoroughly discredited that they have their own acronym, OFAT. It is physically impossible for OFAT experiments to characterize multi-factor processes. OFAT experiments are also notoriously time consuming. This is probably why it took Taylor 26 years to complete his study.
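The OFAT critique can be illustrated with a toy simulation. Suppose a process (hypothetical, not Taylor's data) has a strong two-factor interaction. An OFAT pass that varies one factor while holding the other at its low level sees distorted main-effect estimates and cannot estimate the interaction at all; the four-run factorial exposes it directly.

```python
def response(a, b):
    """Hypothetical process with an interaction between factors a and b."""
    return 10 + 2 * a + 3 * b + 5 * a * b   # coded levels -1/+1

# OFAT: vary one factor at a time, holding the other at -1.
ofat_effect_a = response(+1, -1) - response(-1, -1)   # 2*2 - 2*5 = -6
ofat_effect_b = response(-1, +1) - response(-1, -1)   # 2*3 - 2*5 = -4

# Full 2x2 factorial: all four corners, interaction estimable.
corners = [(-1, -1), (+1, -1), (-1, +1), (+1, +1)]
ys = {c: response(*c) for c in corners}
interaction = (sum(y for (a, b), y in ys.items() if a * b == +1) / 2
               - sum(y for (a, b), y in ys.items() if a * b == -1) / 2)
```

Here the true effect of factor a is positive, yet OFAT reports it as negative because the interaction is confounded into the estimate — exactly the sense in which OFAT experiments cannot characterize multi-factor processes.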


Cost Accounting Variance Analysis

Businessmen in the early twentieth century enjoyed comparing themselves to Einstein, Marconi, Edison, the Wright Brothers and other celebrity scientists of the day. G. Charter Harrison, an accountant with Price, Waterhouse and Company in London, chose Taylor as the celebrity “scientist” he wanted to emulate. Harrison published a series of articles in 1918 in support of his assertion that, “The present generally accepted methods of cost accounting are in as retarded a state of development as were those of manufacturing previous to the introduction by Frederick W. Taylor of the idea of scientific management.”

A tidal wave of popularity was carrying Taylor’s book to best seller status. Harrison rode this wave. He advanced “scientific” principles for cost accounting. He proposed that “standard costs” be established for various tasks, and that actual costs be analyzed as deviations from the standard costs. This was an advance over previous methods. Harrison went on to describe an assortment of things that could be calculated from such differences, including “productivity ratios.”

A 1964 Times Review of Industry article first used the term variance to describe Harrison’s difference between actual and standard costs.5 Perhaps old-school accountants and managers thought “variance” sounded more scientific than “difference.” They had good reason to do so. By 1964 Ronald Fisher’s vector analysis solution to a wide variety of statistical problems was widely known under his general term for them, Analysis of Variance.

Analysis of Variance is the international gold standard for quantitative work in virtually every profession. Prior to the invention of Six Sigma in 1986, two notable professions were the only exceptions to this rule: accounting and business management. By 1978, business journalists were using the phrase “variance analysis” to refer to the examination of differences between planned and actual performance.
The expression persists in today’s accounting textbooks: “The act of computing and interpreting variances is called variance analysis.”6



Needless to say, the cost accounting variance analysis of 1978 bore no relation to the Analysis of Variance invented by Fisher some 58 years earlier. The elements of a standard cost accounting variance analysis are shown in Table 1.


Table 1 Cost-accounting variance report formats vary. The key element is a column labeled “Variance Ratio.” It is the signed difference between an actual value and a standard, budgeted or forecast value, expressed as a percentage.7, 8, 9
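The “Variance Ratio” column described in the caption is simple arithmetic. A sketch (the function name and the $4,000-versus-$5,000 figures are ours, chosen to match the 20%-under-forecast revenue example discussed later in the chapter):

```python
def variance_ratio(actual, standard):
    """Signed difference between an actual value and a standard, budgeted
    or forecast value, expressed as a percentage of the standard."""
    return (actual - standard) / standard * 100.0

# A $5,000 revenue forecast with $4,000 actual revenue:
# a -$1,000 "variance", i.e. 20% under forecast.
ratio = variance_ratio(4000, 5000)
```

Note that this "variance" is pure one-dimensional arithmetic; it carries no information about the variability of the process that produced the numbers, which is the point of the chapter's critique.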


It is unfortunate that the word “variance” was redefined in 1964 to mean a difference between actual and standard values. There is nothing inherently wrong with analyzing such differences. In fact, it is a good idea. The problem comes in the type of “analysis” that is done with such differences, and the actions “variance analysis” conclusions can lead to.

For example, the manager who is responsible for the $1,000 revenue “variance” in Table 1 will be asked to explain himself or herself. After all, the result is 20% under forecast! The explaining of this unacceptable negative variance occurs at the monthly meeting of the Executive Committee.

This monthly ritual creates tremendous pressure to conform. It subverts critical thinking. Managers are forced to develop story-telling skills. A plausible explanation is produced. The manager vows not to let this bad thing happen again. After a month or two or three, the offending “variance” happens again. A plausible explanation is produced. The manager swears never to let this new bad thing happen again. And so on.

The highest-paid employees in the company waste hours, days and even weeks every month, grilling each other over G. Charter Harrison’s 1918 productivity ratios. Objective


standards of evidence are nowhere to be found in their discussions. The Executive Committee may as well try to produce rainbows without a prism. The monthly cross-examination over variance analysis rather than Analysis of Variance (ANOVA) is an indefensible waste of time and money. It is every company’s greatest obstacle to evidence-based decisions and Six Sigma breakthroughs.

Accounting versus Science

Today’s Generally Accepted Accounting Principles (GAAP) are loose guidelines. Little has changed from those submitted to the United States government by a committee of Certified Public Accountants in 1932. The authors of the book Relevance Lost: The Rise and Fall of Management Accounting, winner of the 1989 American Accounting Association’s award for Notable Contribution to Management Accounting Literature, judged the accounting profession to be functioning 70 years behind the times.10 In 2003, that makes it 84 years behind the times! How could this happen?

Perhaps it is a function of what the customers want. The assistant dean of a leading business graduate school recently told us that her university continued to teach a core curriculum subject—cost-accounting variance analysis—that she personally knew to be false. She reasoned, “Businesses in this region hire graduates who know how to use cost-accounting variance analysis.” Another, more revealing explanation runs deeper.

Empirical laws of science are forced to evolve through an inexorable process of disciplined observation, experimentation and analysis that leads to improvements. Over time, a body of evidence becomes so compelling it becomes a new Generalization or Law. Occasionally, new laws force old ones to be revised or scrapped. By contrast, in the words of a Harvard Graduate School of Business Administration textbook, “Accounting principles are man-made.” Unlike principles of physics, chemistry, and

the other natural sciences, cost accounting principles cannot be tested for validity. They have no objective standards of validity. In other words, accounting principles were not deduced from basic axioms, nor can they be verified by observation and experimentation. There is no process of disciplined observation, experimentation and analysis to force improvements. Until one-dimensional cost-accounting arithmetic is upgraded to vector analysis applied to a data matrix, GAAP will remain wide enough to drive a Six Sigma tractor-trailer rig through.

Delusions and Bamboozles

In his 1841 classic Memoirs of Extraordinarily Popular Delusions,12 Charles Mackay described massive losses related to business practices just like today’s cost-accounting variance analysis. He could well have been writing about 20th and 21st Century popular culture when he penned the chapter “The Love of the Marvelous and Disbelief of the True.”

“In reading the history of nations, we find that, like individuals, they have their whims and their peculiarities, their seasons of excitement and recklessness, when they care not what they do. We find that whole communities suddenly fix their minds upon one object, and go mad in its pursuit; that millions of people become simultaneously impressed with one delusion, and run after it, till their attention is caught by some new folly more captivating than the first.”13

This passage is eerily familiar to those of us who have watched businesses become captivated by one management fad after another: “Excellence”, “Zero Defects”, “Total Quality Management”, “Zero-Based Budgeting”, “Re-Engineering”, “Activity-based Cost Accounting”, “Management by Objective” and “Balanced Scorecards” are a few of the greatest hits. None of these well-intentioned initiatives were, or are, particularly bad in and of themselves. They simply lack the firm foundation and objective standards of evidence sound theory provides.

Occasionally, a superstition, fad or fallacy—astrology, homeopathy, phrenology, cost-accounting variance analysis, take your pick—manages to survive for a few years or decades. Eventually, a kind of critical mass of delusion is established. The capacity for critical thinking erodes.14 Carl Sagan explained it this way:

“One of the saddest lessons in history is this: If we have been bamboozled long enough, we tend to reject any evidence of the bamboozle. We’re no longer interested in finding out the truth. The bamboozle has captured us. It’s simply too painful to acknowledge, even to ourselves, that we have been taken. Once you have bamboozled the public, the path of least resistance is to keep on bamboozling. Once you give a charlatan power over you, you almost never get it back.”15

So the old bamboozles tend to persist as new ones rise. All business leaders—plant managers, doctors, billionaire CEOs—face this dilemma when they try to bring evidence-based decisions into their organizations. Our proposal? Replace cost-accounting variance analysis with a reliable, transparent, proven method of analysis based on objective, quantitative standards of evidence. That proven method is profit signal vector analysis applied to a data matrix.

Vector Analysis 101

John Keats, a poet of the Romantic Period, lived in a world of pure expression. If he were an accountant today, he would demand a certain freedom of expression. He would analyze data however he wanted, according to his prevailing emotions. He would assign to his analysis whatever weight of evidence felt right. He would present his data in free verse or any other format he desired. He would not be accountable to the stockholders, or the employees of your company. Generally Accepted Accounting Principles place no premium on truth or even facts. They prize only internal consistency. Amazingly, he would be granted all these freedoms today without so much as a raised eyebrow. According to a 1990s Professor Emeritus at the Harvard Graduate School of

Business Administration, “There are no prescribed criteria [for variance analysis] beyond the general rule that any technique should provide information worth more than the costs involved in developing it.”16

The lack of any prescribed criteria for financial analysis explains why spreadsheets are so popular. (See Table 2.) There are no laws for arranging or analyzing data. The cells can contain text, numbers, graphics, symbols or formulas. There are no rules governing the interpretation of rows and columns. Just like Keats, most people like being free to arrange their data in any way that suits their fancy. What better way to spice up the workday?

Table 2 A spreadsheet consists of rows and columns.

The situation is quite different for a Law or a Generalization like Fisher’s Analysis of Variance. Fisher developed the Analysis of Variance on a farm near London right around the same time Taylor and Harrison were promoting their “scientific” management and cost-accounting principles. Unlike Taylor and Harrison, Fisher actually was a scientist. In sharp contrast with the artificial guidelines of cost accounting, Fisher’s work was grounded in rigorous mathematics and physical reality: Cartesian coordinates, right triangles, Pythagoras’ Theorem, plane and solid geometry and trigonometry, multi-variable calculus and vector analysis. Managers would do well to follow his lead. Evidence-based decision companies repeatedly demonstrate why this is a profitable choice. Ironically, with statistical software they can do so immediately at virtually no cost.

Fisher coined the term Variance in 1918.17 This was 46 years before its debut in the Times Review of Industry. It is an understatement to say the two definitions are significantly different. They are as different as Generalization and

generalization. To quote Mark Twain, “It is the difference between lightning and a lightning bug.”

Instead of a difference between an actual value and a standard value, Fisher’s Variance measures the degree of variability of a set of values around their average. It is based on the length of the variation vector. Fisher’s work defines today’s international standard for analyzing components of variation.

In 1919, at age 29, Fisher was hired to “examine data and elicit further information” from his employer’s database. According to his employer, “It took me a very short time to realize that he was more than a man of great ability, he was in fact a genius who must be retained.”18 Fisher’s job was to re-evaluate a business report identical to the ones managers use for decisions today.19 There were measurements recorded in rows and columns. Fisher’s boss subtracted average annual production numbers from each other to “determine” which years were most productive. Like most people, he wanted to make more money while working shorter hours and using fewer resources. The boss wanted to increase the annual yield in bushels of wheat. Fisher knew exactly how to help his boss achieve these objectives: apply a vector analysis to a data matrix.

Like a spreadsheet, a data matrix consists of rows and columns. There the similarity ends. The rows of a data matrix represent records—the individual objects or events on which we have data. The columns represent fields—the variables measured or inspected for each object. The number of rows is called the sample size. Each column in a data matrix contains the measurements which are the data vector for the variable associated with the column. Each object is represented by a particular coordinate or position in the vector. Table 3 shows a simple data matrix. There are two variables measured on two objects.
Fisher called his method “Analysis of Variance” because its purpose is to break up the variation vector into profit signal and noise components.
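Fisher's Variance can be sketched directly from the variation-vector picture: subtract the average from each value, and the variance is the squared length of the resulting variation vector divided by the degrees of freedom (n − 1, the textbook convention; the book itself reasons in terms of vector lengths). The (3, 4) and (5, 2) data vectors are the ones from Table 3.

```python
def variation_vector(data):
    """Data vector minus its vector of averages."""
    mean = sum(data) / len(data)
    return [x - mean for x in data]

def variance(data):
    """Squared length of the variation vector, per degree of freedom."""
    v = variation_vector(data)
    return sum(x * x for x in v) / (len(data) - 1)

var1 = variance([3, 4])   # Variable 1 from Table 3
var2 = variance([5, 2])   # Variable 2 from Table 3
```

Both data vectors have the same average, yet their variances differ by a factor of nine — the kind of component-of-variation comparison a cost-accounting "variance" cannot express.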

Table 3 Like a spreadsheet, a data matrix consists of rows and columns. The rows of a data matrix represent records—the individual objects or events we have data on. The columns represent fields—the variables for which we have data.

Fisher’s innovation was to think of the data matrix in a geometric framework. Each stack of numbers in a data matrix column is a vector. For example, the data vectors in Table 3 are (3, 4) for Variable 1 and (5, 2) for Variable 2. These vectors are plotted in Figure 2. The two coordinate axes correspond to the two objects. In this example the vectors are two-dimensional because there are two objects in the data matrix. In general, vectors are n-dimensional, where n is the sample size.

We are back into hyperspace. Like the inside of a black hole, hyperspace will remain forever beyond our three-dimensional vision. Nevertheless, it is real. It is there. Evidence-based decision companies use hyperspace to make more money in less time while using fewer resources.

Figure 2 The two columns of Table 3, plotted as vectors.

The first step in a vector analysis is to find the constant vector closest to the data vector. Examples of two-dimensional constant vectors are (1, 1), (2, 2) and (0, 0). The dotted line in Figure 4 locates the set of all possible two-dimensional constant vectors, moving from the lower left point of origin to the upper right. Only a portion of the dotted line is visible at the upper right hand portion of the illustration; Figure 4's center vector masks a long segment of this dotted line.

Figure 3 illustrates the first basic rule of vector analysis: The shortest distance between a point and a line is along a path perpendicular to the line. This is no arbitrary accounting rule. It is a law, a mathematical/scientific Generalization; it is a property of the physical universe.

Figure 3: The shortest distance between a point and a line is along a path perpendicular to the line.

The closest constant vector is always the vector of averages. For our data vectors, the closest point on this line is (3.5, 3.5). It is not a coincidence that 3.5 is the average of 3 and 4. It is not a coincidence that 3.5 is the average of 5 and 2. It does seem coincidental that (3, 4) and (5, 2) have the same average, but we did this on purpose.

Figure 4: The dotted line is the set of all constant vectors.

So, how do these vectors differ? Well, (3, 4) is closer to the vector of averages than (5, 2) is (Figure 4). A data vector close to its vector of averages has less variability than a data vector far from its vector of averages. This means Variable 1 has less variability
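For readers who want to check the claim that the closest constant vector is the vector of averages, here is a short Python sketch of our own (not part of the original text). It searches a grid of candidate constant vectors (c, c) for the one nearest each data vector from Table 3.

```python
import numpy as np

# Data vectors from Table 3.
d1 = np.array([3.0, 4.0])   # Variable 1
d2 = np.array([5.0, 2.0])   # Variable 2

def closest_constant(d, candidates):
    """Among constant vectors (c, c), return the c minimizing the distance to d."""
    dists = [np.linalg.norm(d - np.array([c, c])) for c in candidates]
    return candidates[int(np.argmin(dists))]

# Search a fine grid of candidate constants.
grid = np.round(np.arange(0, 10, 0.01), 2)
print(closest_constant(d1, grid))  # 3.5, the average of 3 and 4
print(closest_constant(d2, grid))  # 3.5, the average of 5 and 2
```

Both searches land on 3.5, the average of the coordinates, just as the first basic rule of vector analysis predicts.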

than Variable 2. You can also tell this just by looking at the numbers in Table 3. This "eyeball" analysis is just for illustration; it is not recommended for your real data sets.

Figure 5 identifies the variation vectors, V1 and V2, for Variables 1 and 2, shown here in bold. The length of the variation vector is directly related to the degree of variability in the data vector.

Figure 5: A is the vector of averages for both Variables 1 and 2. V1 and V2 are the corresponding variation vectors.

How do we calculate the length of a vector? For this we need the second basic rule of vector analysis: The New

Management Equation (a.k.a. the Pythagorean Theorem): The square of the length of the long side of a right triangle is equal to the sum of the squares of the lengths of the other two sides (c² = a² + b²). Once again, this is no arbitrary accounting rule; it is a property of the physical universe. The New Management Equation is so well known in professional financial and investment analysis circles that a bi-monthly newspaper, Financial Engineering News, was founded in 1997 to disseminate case studies.

Now we can figure out the lengths of the variation vectors in Figure 5. In Figure 6 we use the New Management Equation to calculate the lengths of data vectors D1 and D2. Using the letters in Figure 5, the New Management Equation for Variable 1 is:

(D1)² = A² + (V1)²

Only the alphabetic notation differs from the New Management Equation. We can see in Figure 6 that (D1)² = 25.

Figure 6: The length of a vector is the square root of the sum of the squares of its coordinates.

Also, the squared length of the data average vector A is:

A² = 3.5² + 3.5² = 24.5

We can now plug these two numbers, 25 and 24.5, into the New Management Equation for data vector D1:

25 = 24.5 + (V1)²
25 - 24.5 = 0.5 = (V1)²
V1 = square root of 0.5 = 0.71

This final number, 0.71, the length of the variation vector for Variable 1, is called the sample standard deviation for Variable 1. A sample standard deviation is symbolized in technical writing by the letter s.

The New Management Equation for Variable 2 works the same way:

(D2)² = A² + (V2)²

We know from Figure 6 that D2 = 5.39. We already know that A is the square root of 24.5, which equals 4.95. We can now plug these into the New Management Equation:

5.39² = 4.95² + (V2)²
29.05 = 24.5 + (V2)²
29.05 - 24.5 = 4.55 = (V2)²
V2 = square root of 4.55 = 2.13

Please do keep your eyes on the right triangles in the illustrations. The Greek letter sigma (σ) refers to the standard deviation of a population. In Six Sigma practice, the sample standard deviation, s, is often casually referred to as "sigma" or σ. This substitution is a grievous breach of statistical theory, but everyone who uses statistics does it. This is where Six Sigma gets its name.
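The whole decomposition can be verified numerically. This Python sketch (our own illustration, not from the book) splits each data vector of Table 3 into its average vector and variation vector, and confirms the New Management Equation for the squared lengths.

```python
import numpy as np

def decompose(d):
    """Split a data vector into its average vector A and variation vector V (d = A + V)."""
    a = np.full_like(d, d.mean())   # vector of averages, e.g. (3.5, 3.5)
    v = d - a                       # variation vector
    return a, v

d1 = np.array([3.0, 4.0])           # Variable 1 from Table 3
d2 = np.array([5.0, 2.0])           # Variable 2 from Table 3

for d in (d1, d2):
    a, v = decompose(d)
    # New Management Equation: |D|^2 = |A|^2 + |V|^2
    assert np.isclose(np.sum(d**2), np.sum(a**2) + np.sum(v**2))
    # Length of the variation vector: 0.71 for Variable 1 and 2.12 for
    # Variable 2 (the text's 2.13 reflects rounding D2 to 5.39).
    print(round(np.linalg.norm(v), 2))
```

With exact arithmetic V2 is the square root of 4.5, about 2.12; the 2.13 in the worked example comes from carrying the rounded value 5.39 through the calculation.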

The sample standard deviation for Variable 2 is 3 times larger than that for Variable 1! Variable 2 is 3 times more variable than Variable 1. Six Sigma values smaller variation because outcomes are more predictable. Predictions are more accurate. There is less waste and rework. Everything just works better when the profit signals are large/strong and the noise is small/weak.

Degrees of Freedom

Don't panic. This is just background information. You can skip this section if you want, or you can stay tuned. Think of what follows as a mandatory Federal Communications Commission announcement on your National Public Radio station. It has to be here to ensure we are not breaking any Laws of the Universe. In either case, software takes care of all this stuff. (Whenever our airplane takes off or lands, we certainly hope our pilot and co-pilot have this information at their fingertips.)

Sometimes an analyst might want or need to know the actual coordinates of a variation vector. We get the coordinates of the variation vector by subtracting the data average vector from the data vector. The clearest way to explain the subtraction of vectors is to give the vectors a vertical orientation, the way they appear in a data matrix. The coordinates of the variation vector for Variable 1 are given by:

(3, 4) - (3.5, 3.5) = (-0.5, 0.5)

For Variable 2, they are given by:

(5, 2) - (3.5, 3.5) = (1.5, -1.5)

So far, so good. Now, here is an important Law of the Universe: The coordinates of a variation vector always add up to zero. Because of this, the second coordinate in a two-dimensional variation vector is always equal to the negative of the first coordinate. This means that a two-dimensional variation vector is completely determined by its first coordinate. We express this by saying that a two-dimensional variation vector has one degree of freedom.

Suppose now we have a three-dimensional data vector, for example (3, 4, 5) or (5, 4, 3). The vector of averages for both of these is (4, 4, 4). Once again using the vertical data-matrix orientation, the first variation vector is:

(3, 4, 5) - (4, 4, 4) = (-1, 0, 1)

and the second is:

(5, 4, 3) - (4, 4, 4) = (1, 0, -1)

In a three-dimensional variation vector, the third coordinate is always equal to minus the sum of the first two coordinates. This means that a three-dimensional variation vector is completely determined by its first two coordinates. We express this by saying that a three-dimensional variation vector has two degrees of freedom.

Now let n stand for the number of objects in your data set. This is the same as the number of rows in your data matrix. It is your sample size. All the vectors are now n-dimensional. We are back into genuine hyperspace again. The more often you go there, the less scary it becomes. Visits become more profitable. They become fun.
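This Law of the Universe is easy to confirm for yourself. A minimal Python sketch (our addition) computes variation vectors and checks that the coordinates sum to zero, which is exactly why a variation vector loses one degree of freedom.

```python
import numpy as np

def variation(d):
    """Variation vector: data vector minus its vector of averages."""
    return d - d.mean()

v2d = variation(np.array([3.0, 4.0]))        # two-dimensional case from Table 3
v3d = variation(np.array([3.0, 4.0, 5.0]))   # three-dimensional example

# Law of the Universe: the coordinates always add up to zero...
assert np.isclose(v2d.sum(), 0.0)
assert np.isclose(v3d.sum(), 0.0)

# ...so the last coordinate is determined by the ones before it.
assert np.isclose(v2d[1], -v2d[0])                # one degree of freedom
assert np.isclose(v3d[2], -(v3d[0] + v3d[1]))     # two degrees of freedom
```

Subtracting the mean guarantees the zero sum, so an n-dimensional variation vector never has more than n - 1 free coordinates.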

We now return to our regularly scheduled program of writing with an improved degree of simplicity. The last coordinate of an n-dimensional variation vector is always equal to minus the sum of the first n - 1 coordinates. This means that an n-dimensional variation vector is completely determined by its first n - 1 coordinates. We express this by saying that an n-dimensional variation vector has n - 1 degrees of freedom. Don't blame us; it's a Law of the Universe.

The upshot of all this is this: the standard deviation is exactly equal to the length of the variation vector only when n = 2. When n is greater than 2, we have to divide the length of the variation vector by the square root of its degrees of freedom. We will come back to this later in the chapter, and also in Chapters 5 and 6.

Bar Chart Bamboozles

Bar charts and pie charts symbolize old-school management thinking as no other icon can. They are easy to use. They present data in superficial ways. At worst, they frequently are used to misrepresent data. They are the "Gee Whiz" graphs in Huff's How to Lie with Statistics.

The typical bar chart presents totals or averages with no consideration of variability. There are no deviations from the average. There is no Chance variation. There is no Noise. This violates a Law of the Universe. Variation is a physical property of objects and measurements. There is always Noise. Consequently, we might say bar charts have a 50/50 chance of giving correct information, because they consider only one of two aspects and ignore the other, which is statistical variation. A vector analysis forces us to consider both average and standard deviation.

At best, bar charts encourage managers to use Frederick Taylor's thinking: "This bar is bigger than that bar is and I know the reason why because I am a scientific manager and I say so."
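The rule for the standard deviation can be stated in three lines of Python (a sketch of our own, using hypothetical numbers in the second check): divide the length of the variation vector by the square root of its degrees of freedom, and you get the textbook sample standard deviation.

```python
import numpy as np

def sample_sd(d):
    """Length of the variation vector divided by the square root of its degrees of freedom."""
    v = d - d.mean()                            # variation vector, n - 1 degrees of freedom
    return np.linalg.norm(v) / np.sqrt(len(d) - 1)

# For n = 2 the divisor is sqrt(1), so s equals the length of V exactly:
assert np.isclose(sample_sd(np.array([3.0, 4.0])), np.sqrt(0.5))

# For n > 2 the division matters; the result matches the usual definition of s:
d = np.array([5.0, 7.0, 6.0, 2.0])              # hypothetical data
assert np.isclose(sample_sd(d), np.std(d, ddof=1))
```

The `ddof=1` argument in numpy's `std` is the same degrees-of-freedom adjustment described above.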

As an example, consider the monthly revenue data in Table 4. This is a snapshot of data entered into a spreadsheet. The annual totals are plotted as a bar chart in Figure 7. All Figure 7 really does is graphically frame the differences between the annual totals. The upward trend looks very encouraging. The Marketing Manager would certainly want to take credit for this. But because Laws of the Universe are ignored, there is no way to tell whether the "trend" is a profit signal or noise. This is a bit like trying to ignore gravity.

Table 4: Monthly revenue for four years.

Figure 7: Excel's popular bar chart/trend line combination is like Romantic poetry. This poetic license gives everyone the freedom to take credit for good results, whether or not they are true.

Corporate cultures that use cost-accounting variance analysis as the standard decision-making tool often use bar charts and trend lines to present "results" like Figure 7 based on data like that in Table 4. The credibility of the results portrayed by the chart, and the explanation for them, comes from the status of the person telling the story rather than the evidence in the data. There is no cross-examination of the reported results because it is considered poor form, not to mention career limiting, to question the President, Chief Financial Officer, Managing Director, or a company founder who created spreadsheet software.

There are several things wrong with the "analysis" in Figure 7. For one thing, it uses only the annual totals instead of the original monthly data.

In corporate cultures that base decisions on objective standards of evidence, the analysis method itself is held to high standards. Evidence is admissible if and only if the analysis method takes all aspects of the data into account. Quite simply, it must follow the Laws of the Universe. It must follow the rules of vector analysis. The analysis must have transparency. All elements must be available for review, including the raw data. Anyone can ask any question because all the data are in view.

The vector analyses illustrated below represent the international standard. For purposes of illustration, we will present two vector analyses that use only the four annual totals. The first of these is given in Table 5. Table 5 lays out the basic vector calculations for the sample standard deviation, s. This is a vector analysis in four-dimensional hyperspace. The lengths of the vectors are related by the New Management Equation: 140.36 = 140.30 + 0.06. [C² does equal A² plus B².]

The data average vector has one degree of freedom because one number, the average of the four data points (5.923 in this case), determines it. Because there are four data points, this leaves three degrees of freedom for the variation vector. In this case s = 0.14.
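The same decomposition can be sketched in Python. The four totals below are hypothetical stand-ins (the actual values appear only in Table 5), chosen merely to have a similar average; any four numbers obey the same New Management Equation.

```python
import numpy as np

# Hypothetical annual totals (millions of dollars); stand-ins for Table 5's values.
totals = np.array([5.78, 5.89, 5.96, 6.06])

avg_vec = np.full(4, totals.mean())   # data average vector: 1 degree of freedom
var_vec = totals - avg_vec            # variation vector: 4 - 1 = 3 degrees of freedom

# New Management Equation: SUMSQ(data) = SUMSQ(average) + SUMSQ(variation)
assert np.isclose(np.sum(totals**2), np.sum(avg_vec**2) + np.sum(var_vec**2))

# Sample standard deviation: length of the variation vector over sqrt(3).
s = np.linalg.norm(var_vec) / np.sqrt(3)
print(round(s, 2))
```

The `np.sum(x**2)` calls play exactly the role of Excel's SUMSQ function described below.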

We used Microsoft Excel to create the visual presentation in Table 5. The squared lengths of the vectors were calculated by using the cell function SUMSQ. This function name is short for "sum of squares". This is appropriate because the squared length of a vector is the sum of the squares of the coordinates. The syntax for the Excel calculation is:

= SUMSQ(cell range)

Table 5: Basic vector analysis of the four annual totals (millions of dollars).

Figure 8: The four annual totals from Table 4 and the corresponding Normal distribution curve.

Figure 8 shows the Normal distribution curve corresponding to a mean of $5.92 million and a sample standard deviation of $0.14 million. The dots just above the horizontal axis represent the four annual totals. Each vertical dotted line represents one standard deviation. All four data points lie within two standard deviations of the mean. We must conclude that the deviations from the mean

value are a result of natural, or Chance, variation. There is certainly no evidence of significant differences among these totals.

Our second vector analysis addresses directly the validity of the bar graph trend line in Figure 7. The visual presentation of the analysis is shown in Table 6. The null hypothesis for this analysis is the following statement: There is no significant trend in the annual totals. This is not a foregone conclusion. It is a special kind of hypothesis. The idea is to see whether or not the evidence in the data is strong enough to discredit the null hypothesis. Then, and only then, can we say there is a significant trend in the annual totals.

The variation vector is broken up into the sum of profit signal and noise vectors. These three vectors are related by the New Management Equation (a.k.a. Pythagorean Theorem). This is a Law of the Universe. The squared lengths of the vectors are also called "sums of squares."

The profit signal vector is equal to the best-fit line in Figure 7 minus the data average. The coordinates of the profit signal always add up to zero, so it is completely determined by the slope of the best-fit line. As a result, the profit signal vector has one degree of freedom. That leaves two degrees of freedom for the noise vector.

To get the profit signal and noise variances, we divide the sums of squares by the degrees of freedom. Without this adjustment, the variances would be biased. When we divide the profit signal variance by the noise variance we get a signal-to-noise ratio that measures the strength of evidence against the null hypothesis. It is called the F ratio, or F statistic, because Ronald Fisher invented it. It is used in applied research all over the world. Larger values of F imply stronger evidence against the null hypothesis. In this case F = 2.843.

This number doesn't seem very large. But there is no standard scale of comparison for the F ratio. Instead, we interpret it relative to a statistical distribution representing chance variation. This distribution depends on the degrees of freedom for the profit signal and noise vectors. The p-value of 0.234 in Table 6 is the probability of getting an F ratio as large as 2.843 by chance alone. If the p-value is small enough, we reject the null hypothesis.

By established international standards, the evidence against the null hypothesis is 'clear and convincing' if the p-value is less than 0.05. If the p-value is greater than 0.05 but less than 0.15, there is a 'preponderance of evidence' against the null hypothesis. The p-value in Table 6 does not meet even this lowest standard of evidence. There is no significant trend.

Table 6: Illustration of the vector analysis for a linear trend in the four annual totals. The squared lengths of the vectors are also called "sums of squares." This is a reference to the New Management Equation, which involves a sum of squared numbers.

Table 7 shows the monthly revenue numbers in data matrix format. This data set is too large to use as a tutorial. We present some smaller examples in Chapter 5.
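The trend test itself can be sketched in a few lines of Python (our own illustration; the totals are hypothetical stand-ins, so the resulting F will differ from the 2.843 in Table 6). The profit signal vector is the best-fit line minus the data average, and the noise vector is whatever is left over.

```python
import numpy as np

def trend_f_ratio(y):
    """F ratio for a linear trend: profit signal variance over noise variance."""
    n = len(y)
    x = np.arange(1, n + 1)                     # time index 1, 2, ..., n
    slope, intercept = np.polyfit(x, y, 1)      # least-squares best-fit line
    fitted = slope * x + intercept
    signal = fitted - y.mean()                  # profit signal vector, 1 df
    noise = y - fitted                          # noise vector, n - 2 df
    variation = y - y.mean()
    # New Management Equation: variation SS = signal SS + noise SS
    assert np.isclose(np.sum(variation**2), np.sum(signal**2) + np.sum(noise**2))
    return (np.sum(signal**2) / 1) / (np.sum(noise**2) / (n - 2))

# Hypothetical annual totals, for illustration only.
print(round(trend_f_ratio(np.array([5.78, 5.89, 5.96, 6.06])), 2))
```

The internal assertion checks the Law of the Universe: the best-fit construction makes the signal and noise vectors perpendicular, so their squared lengths add up.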

Table 7: The monthly revenue numbers in data matrix format (thousands of dollars).

Meanwhile, a great deal can be learned simply by plotting the data in time sequence. This is done in Figure 9. It doesn't take a Statistician to see that there is no trend here, just random variation. The only features of note are the three low points at the beginning of the series. It turns out these were the last three months before a change in the accounting procedures. They should have been omitted from the analysis.

Figure 9: The monthly revenue numbers plotted in time sequence.

The Game is Afoot

Another example of a full vector analysis is the shoe-sole wear rate workshop in the classic 1978 text Statistics for Experimenters: An Introduction to Design, Data Analysis and Model Building by George Box, William Hunter and J. Stuart Hunter.20 This example uses the small data set presented in their book, with an invented story line based on our consulting experiences. It achieves the following objectives:

1. You can quickly see the differences between a typical spreadsheet analysis and a vector analysis applied to a data matrix.
2. The manufacturing design, cost and margin analogies are appropriate.

A design team is arguing over the wear rates of shoe-sole materials A and B. Material A, the current specification, is more costly than Material B. The manager wants to go with Material B because it is cheaper, and his spreadsheet analysis shows there will be no significant loss of durability. Engineers

are concerned that Material B is not sufficiently durable. Data has been collected and arrayed in a spreadsheet. (See Figure 10.)

Ten boys were enlisted for the test. Each boy wore one shoe made from Material A and one from Material B. Coin tosses were used to randomly assign Material A to the left or right foot for each boy.

Figure 10: Wear rate data as arrayed in Excel.

Figure 11: Wear rate data as analyzed by a spreadsheet bar graph.

The average wear rate for Material B comes out 0.41 units higher than for Material A, an increase of 3.86%, as shown by the bar chart in Figure 11. Furthermore, there were a number of cases where Material A actually wore out faster than Material B! The manager is elated. Given the price difference between the two materials, the manager concludes that the difference in durability is irrelevant. By using

material B instead of A, the shoe manufacturer can increase profit margins and maintain product durability. This change will be worth millions to the bottom line. After a long and difficult team meeting, consensus is reached. The company will replace material A with the less costly, equally durable material B.

It is getting late. People have places to go, things to do. Nevertheless, as the meeting is wrapping up, a Six Sigma Black Belt in training asks if she can analyze the data herself using a vector analysis applied to a data matrix. To maintain good relationships, they give her five minutes. She imports their Excel spreadsheet into her statistical package. For present purposes, we recreate her vector analysis data in Excel. This is shown in Table 8. (We timed both methods. The Excel reconstruction literally took 10 times longer than doing a correct vector analysis in the statistical package.)

Table 8: Vector analysis of the wear-rate data.

The Black Belt trainee starts her extemporaneous presentation by stating the null hypothesis for the analysis: "There is no difference between the average wear rates of the two materials." The trainee explains that this is a hypothesis, rather than a foregone conclusion: a straw man to be pulled apart by evidence. The idea is to see whether or not the

evidence in the data is strong enough to discredit the null hypothesis.

She goes on to explain that we should be looking at the differences between A and B for each boy; that was the whole point of having each boy wear one shoe of each kind. If the null hypothesis were true, the differences should be symmetrically distributed around zero. Also, the average difference should be close to zero.

With three clicks of her mouse, the trainee produces a frequency histogram of the differences (see Figure 12). Pointing at the graph, she says, "As you can see, all but two of the differences are positive. This casts doubt on the null hypothesis; the wear rates for Material B are consistently higher than those for Material A. But let's not jump to conclusions. We need to complete the vector analysis to establish the strength of this evidence."

Figure 12: Frequency histogram of differences in wear rate (B minus A).

"For analyzing matched pairs like we have here, the data average and the profit signal vector are one and the same. As you can see, the vector analysis (Table 8) breaks the vector of differences into the sum of the data average vector and the noise vector.

"The vectors are 10-dimensional because there are 10 differences. We are way into hyperspace. The profit signal vector is determined by one number, the average difference of 0.41, so it has one degree of freedom. That leaves nine degrees of freedom for the noise vector. As you can see, the lengths of the vector of differences, the profit signal vector and the noise vector are related by the New Management Equation (a.k.a. sums of squares, a.k.a. squared lengths of vectors).

"Let's take a pause for just a moment here to do a little yoga stretching while our minds are bending." Her presentation was interrupted by one of her friends. After some uncomfortable laughter, the Black Belt's Six Sigma analysis continued. "OK. We are back on task.

"We have to adjust the New Management Equation by dividing the sums of squares by the degrees of freedom. This gives us Variances that measure the strength of the profit signal and noise vectors. When we divide the profit signal variance by the noise variance we get a signal-to-noise ratio that measures the strength of evidence against the null hypothesis. It is called the F ratio because a guy named Fisher a long time ago invented it. As you can see," she said, pointing at her computer screen, "the F ratio in this case is 11.215.

"The F ratio can't be interpreted on its own. We have to compare it to a distribution to see how likely it is that a value as large as 11.215 could have occurred by chance alone. This probability is called the p-value. As you can see, the p-value in this case is 0.0085.

"By established international standards, the evidence against the null hypothesis is 'clear and convincing' if the p-value is less than 0.05, and it is 'beyond a reasonable doubt' if the p-value is less than 0.01 (Table 9). If the p-value is small enough, we have to reject the null hypothesis. This means there is a significant difference between A and B, beyond a reasonable doubt."
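Her matched-pairs analysis can be reproduced in Python using the boys' shoes wear data as published in Statistics for Experimenters (Box, Hunter and Hunter, 1978); the code below is our sketch, not the Black Belt's statistical package.

```python
import numpy as np
from scipy.stats import f

# Wear data for ten boys from Box, Hunter and Hunter (1978), materials A and B.
wear_a = np.array([13.2, 8.2, 10.9, 14.3, 10.7, 6.6, 9.5, 10.8, 8.8, 13.3])
wear_b = np.array([14.0, 8.8, 11.2, 14.2, 11.8, 6.4, 9.8, 11.3, 9.3, 13.6])

d = wear_b - wear_a                 # one difference per boy: a 10-dimensional vector
avg_vec = np.full(10, d.mean())     # profit signal vector, 1 degree of freedom
noise = d - avg_vec                 # noise vector, 9 degrees of freedom

f_ratio = (np.sum(avg_vec**2) / 1) / (np.sum(noise**2) / 9)
p_value = f.sf(f_ratio, 1, 9)       # upper-tail F probability, like Excel's FDIST

print(round(d.mean(), 2))           # 0.41
print(round(f_ratio, 3))            # 11.215
print(round(p_value, 4))            # 0.0085
```

The three printed numbers match the average difference, F ratio and p-value reported in the story.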

Table 9: The Black Belt showed the table of evidence to the team.

One engineer says, "That makes a lot of sense. Even though the difference was less than 4%, we felt that a difference of 0.41 units could cause problems. We were afraid yield losses would exceed the savings on material costs."

A potential disaster is narrowly averted by using an evidence-based decision in the nick of time. Critical-to-quality characteristics and financial margins are protected. The company's reputation for quality is preserved. The company takes the next step forward by implementing Six Sigma across all projects and functional responsibilities in the corporate matrix. Their first-wave Black Belts are now in Master Black Belt training using their own case studies. Some of that work has been presented in this chapter. There is more to come in Chapters 5 and 6. The next Black Belt, Green Belt, Yellow Belt and Champion courses are filled to capacity. The waiting lists for the following sessions are long. Just another day in the life of a Six Sigma company.

Spreadsheet versus Data Matrix

Spreadsheet arithmetic is today's cost-accounting variance analysis computing engine. While teaching the real Analysis of Variance we often hear the comment, "So what's the big deal with a data matrix? You can do all that in a spreadsheet." This is true. We know because we have done it.21 It is also true that you could eventually compute the orbital trajectories of all the planets in our solar system with an abacus. Unless you and your loved ones have nothing better to do with the rest of your lives, our question is this: "Why would anyone want to?"

The spreadsheet is a marvelous invention. It automates arithmetic. You can write formulas. You can put your data wherever you want it and analyze it however you want. If you add in enough add-ins, you can actually do some statistics, even Analysis of Variance. We did in fact make Tables 5, 6 and 8 in a spreadsheet, although it is tedious.

The greater liability in trying to do everything with a spreadsheet stems from the very freedom that makes spreadsheets so popular. Spreadsheet applications are unruly and Lawless. The Laws of the Universe do not apply to them. There is no requirement for vector analysis, no requirement for transparency. The undemanding nature of spreadsheets lures unsuspecting users into sins of omission. Adding in the add-ins is a clumsy way of trying to reinvent the machinery of a vector analysis that already exists in modern statistical software.

Other spreadsheet characteristics are simply inconvenient or annoying. For example, many spreadsheet functions treat blank cells as zeroes. This works fine for adding and subtracting. In reality, a blank cell indicates a missing value in a data vector. A missing value changes the degrees of freedom and dimension of the vector. The vector analysis can handle this, although it does affect the results. By contrast, the cavalier insertion of zeroes for missing values wreaks havoc on vector analysis, giving incorrect results.

Statistical packages, on the other hand, follow the Law. Data vectors are the principal components of vector analysis. Statistical packages require the correct data matrix structure: each row an object of interest, each column a vector of data on the objects. (Hence the name.) Unlike spreadsheets, statistical packages automatically create the variation, profit signal and noise vectors shown in Tables 5, 6 and 8. These programs give you access to this machinery with a mouse click. Vector analysis provides the transparency required to satisfy international accounting standards and scientific standards of evidence. These and related comments are summarized in Table 10.
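The blank-cell hazard is easy to demonstrate. In this Python sketch (our own, with hypothetical numbers), treating a blank as missing preserves the correct degrees of freedom, while the cavalier zero changes both the average and the standard deviation.

```python
import numpy as np

# A data vector with one missing value, as it might come out of a spreadsheet.
with_nan = np.array([5.0, 7.0, np.nan, 6.0])
as_zero = np.array([5.0, 7.0, 0.0, 6.0])       # the cavalier zero insertion

# Treating the blank as missing leaves 3 observations (2 degrees of freedom):
print(round(np.nanmean(with_nan), 2))          # 6.0
print(round(np.nanstd(with_nan, ddof=1), 2))   # 1.0

# Treating the blank as zero silently changes both answers:
print(round(np.mean(as_zero), 2))              # 4.5
print(round(np.std(as_zero, ddof=1), 2))       # 3.11
```

One phantom zero drags the average down by 25% and triples the apparent variation, exactly the havoc described above.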

Table 10: Comparing and contrasting a spreadsheet and a data matrix.

P-values, Confidence Levels and Standards of Evidence

A null hypothesis always consists of a negative assertion. Here are some examples:

• There is no difference between these two ways of doing things.
• There are no differences among these three or more ways of doing things.
• There is no relationship between these two variables.
• There are no relationships among these three or more variables.
• There are no relationships between these two groups of variables.

The phrasing of a null hypothesis is not a law of the universe, but it is an odd standard. The null hypothesis often plays the role of a "straw man" in inductive reasoning. According to the on-line folklore database Wikipedia, the straw man concept began as a rodeo safety tactic.22 A straw man would distract bulls. It could be torn apart with no harm done. We can tear apart the straw man, the null hypothesis, if it is something we would like to disprove based on the data. Inductive and deductive reasoning are built into data matrix software. No such discipline exists in a spreadsheet.

The F ratio, or F statistic, is a signal-to-noise ratio that measures the strength of evidence in the data against the null hypothesis. As the F ratio increases, the strength of evidence against the null hypothesis increases. However, there is no standard scale of comparison for the F ratio. We evaluate an F ratio by comparing it to a statistical distribution to see how likely it is that a value that large could have occurred by chance alone. The distribution to which the F ratio is compared depends on the degrees of freedom for the profit-signal and noise vectors. We get around this by working with a probability computed from the F value. This probability, called the p-value, is the probability of getting an F ratio as large as the value we got by chance alone. If the p-value is small enough, we reject the null hypothesis.

In Microsoft Excel, the cell formula syntax for calculating the p-value is this:

= FDIST(value of F ratio, degrees of freedom for the profit signal vector, degrees of freedom for the noise vector)

For example, the formula to produce the p-value 0.234 in Table 6 is as follows:

= FDIST(2.843, 1, 2)

Here 2.843 is the value of the F ratio, 1 is the number of degrees of freedom for the profit signal vector, and 2 is the number of degrees of freedom for the noise vector. Enter this formula into your Excel spreadsheet and you will get the correct answer: 0.234.

The formula to produce the p-value 0.0085 in Table 8 is as follows:

= FDIST(11.215, 1, 9)

Here 11.215 is the value of the F ratio, 1 is the number of degrees of freedom for the profit signal vector, and 9 is the number of degrees of freedom for the noise vector. Enter this formula into your Excel spreadsheet and you will get the correct answer: 0.0085.
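Readers working outside Excel can get the same numbers from scipy, whose upper-tail F probability `f.sf` plays the role of FDIST. A minimal sketch (our addition):

```python
from scipy.stats import f

def fdist(f_ratio, df_signal, df_noise):
    """Upper-tail F probability, equivalent to Excel's FDIST."""
    return f.sf(f_ratio, df_signal, df_noise)

print(round(fdist(2.843, 1, 2), 3))    # 0.234, the trend analysis of Table 6
print(round(fdist(11.215, 1, 9), 4))   # 0.0085, the wear-rate analysis of Table 8
```

Both values agree with the Excel formulas above to the printed precision.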

Standards of Evidence


We do not like writing spreadsheet formulas. We do like the fact that statistical software does it for us automatically. As the F ratio increases, the p-value decreases. As the p-value decreases, the strength of evidence against the null hypothesis increases. This tends to confuse people. It is easier to think in terms of confidence levels (Table 11). The confidence level is one minus the p-value, usually expressed as a percentage. As the confidence level increases, the strength of evidence against the null hypothesis increases.
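The p-value to confidence level conversion is simple enough to show in two lines of code. A minimal sketch in Python, reusing the two p-values from the Excel examples above:

```python
# Confidence level is one minus the p-value, expressed as a percentage.
def confidence_level(p_value):
    return (1.0 - p_value) * 100.0

print(round(confidence_level(0.0085), 2))  # strong evidence: above 99%
print(round(confidence_level(0.234), 1))   # weak evidence: below 95%
```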

Table 11 Standards of evidence in a nutshell. A p-value less than 0.05 yields a confidence level greater than 95%. A p-value less than 0.01 yields a confidence level greater than 99%.

Closing Arguments

Themis is the Blind Lady of Justice in Greek mythology.

Themis: “As an oracle, I used to advise Zeus when he made decisions. I did my job so well I became the goddess of divine justice. You can see from some of my portraits that I used to carry a sword in one hand and a set of scales in the other. The blindfold I wore was more than a fashion statement. It meant I would be fair and equitable in my judgments. My whole existence hinges on objective standards of evidence.”23


Endnotes

1. American Heritage Dictionary of the English Language, Third Edition. Boston: Houghton Mifflin Company, 1992.
2. Dawkins, Richard. Unweaving the Rainbow: Science, Delusion and the Appetite for Wonder. Boston: Houghton Mifflin Company, 1998.
3. Huff, Darrell, and Geis, Irving. How to Lie with Statistics. New York: W.W. Norton and Company, 1954.
4. Taylor, Frederick Winslow. Scientific Management. Mineola: Dover Press, 1998. Pages 55-59. The original 1911 version was published by Harper and Brothers, New York and London.
5. Oxford English Dictionary, 1989.
6. Garrison, Ray H., and Noreen, Eric W. Managerial Accounting, 10th Edition. Boston: McGraw-Hill Irwin, 2003. Page 431.
7. Harrison, G. Charter. Cost Accounting to Aid Production – I. Application of Scientific Management Principles. Industrial Management, The Engineering Magazine, Volume LVI, No. 4, October 1918.
8. Harrison, G. Charter. Cost Accounting to Aid Production – I. Standards and Standard Costs. Industrial Management, The Engineering Magazine, Volume LVI, No. 5, November 1918.
9. Harrison, G. Charter. Cost Accounting to Aid Production – I. The Universal Law System. Industrial Management, The Engineering Magazine, Volume LVI, No. 6, December 1918.
10. Johnson, H. Thomas, and Kaplan, Robert S. Relevance Lost: The Rise and Fall of Management Accounting. Boston: Harvard Business School Press, 1991. Pages 10-12.
11. Anthony, Robert N., and Reece, James S. Accounting: Text and Cases, Eighth Edition. Homewood: Irwin, 1989. Page 15.
12. MacKay, Charles. Memoirs of Extraordinarily Popular Delusions. Copyright 2002 eBookMall version available for $1.75. http://www.ebookmall.com/alpha-authors/m-authors/Charles-MacKay.htm
13. MacKay, Charles. Memoirs of Extraordinarily Popular Delusions. Copyright 2002 eBookMall version. Page 8.
14. Gardner, Martin. Fads and Fallacies in the Name of Science. New York: Dover Press, 1957. Page 106.
15. Sagan, Carl. The Demon-Haunted World: Science as a Candle in the Dark. New York: Ballantine Books, 1996. Page 241.
16. Anthony, Robert N., and Reece, James S. Accounting: Text and Cases, Eighth Edition. Homewood: Irwin, 1989. Page 941.
17. Oxford English Dictionary, 1989.
18. Box, Joan Fisher. R.A. Fisher: The Life of a Scientist. New York: John Wiley and Sons, 1978. Page 97.
19. Box, Joan Fisher. R.A. Fisher: The Life of a Scientist. New York: John Wiley and Sons, 1978. Pages 100-102.
20. Box, George E.P., Hunter, William G., and Hunter, J. Stuart. Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building. New York: John Wiley & Sons, 1978.
21. Dilson, Jesse. The Abacus, The World’s First Computing System: Where it Comes From, How it Works, and How to Use it to Perform Mathematical Feats, Large and Small. New York: St. Martin’s Press, 1968.
22. http://www.wikipedia.org/wiki/Straw_man
23. http://www.commonlaw.com/Justice.html


Chapter 3 Evidence-based Six Sigma

Six Sigma (6σ) is a proven, pursuit-of-perfection business initiative that creates breakthroughs in profitability, productivity, and quality. It is a highly structured, project-by-project way to generate bottom line results. It produces significant dollar value through a never-ending series of breakthrough projects. Evidence-based decisions characterize the 18-year, 6σ record of accomplishment. The essential elements of Six Sigma breakthrough projects are vector analyses applied to data matrices. Hundreds of millions of dollars have been placed directly onto the bottom line of companies around the world using this improvement model and its tool set. Though large multi-national corporate results have attracted the most media attention, we have personally seen a 26-employee plastic pressure and vacuum forming company achieve proportionally identical results.

Six Sigma knowledge and know-how have evolved since the notion of perfect 6σ quality was first conceived by Motorola engineer Bill Smith. Motorola’s Chief Executive Officer at the time, Robert Galvin, was the first Six Sigma Champion. He enthusiastically led the entire program. He personally removed bureaucratic obstacles to breakthrough improvements. Six Sigma became an education and training commodity in the late 1990s. It gains momentum as it matures.


The catchy three syllable “Six Sigma” moniker is value-added packaging for vector analysis and objective evidence. Corporate executives embrace them even though only a few know what the phrase and acronym mean. That is a remarkable accomplishment in anyone’s marketing book of records. The jargon side of this business initiative is as real as it is regrettable. Six Sigma also conveys substance. Wall Street likes 6σ because it ties customer satisfaction directly to corporate profitability. Customer satisfaction, speed, quality information, and lean organizational structures are Six Sigma cultural values. What is valued gets measured, analyzed and is rewarded.

A Six Sigma analysis is a vector analysis applied to a data matrix. This analytic process is sometimes called an Analysis of Variance, or ANOVA. An ANOVA breaks raw data into six vectors (Figure 1). Two are priceless business intelligence commodities: 1) Profit Signals and 2) Noise. (Historically, Profit Signals have been called “treatment deviations.” That appealed to engineers and statisticians. Since the acronym and its equations are traditionally presented in ways that are guaranteed to bore even motivated academics, the mass market of Six Sigma calls for better branding. We answered that call. Calling them Six Sigma Tools has worked wonders.) Six Sigma gets its name from the vector analysis results.

Computing power transforms what was once an almost impossibly difficult series of matrix algebra calculations into a single computer command: “Run Model.” Anyone who wants to correctly analyze measurement data can now do so in seconds. Six Sigma measurements are recorded in data matrices. Since data matrix applications are essential to vector analysis, every Six Sigma champion executive and every manager in a Six Sigma company has data matrix software loaded on their personal computers, if she or he expects to be promoted. Though many products are available, two currently dominate the market: Minitab and JMP. Every true 6σ company has its own corporate software standards. When a company combines computing power, the principles of accelerated adult learning and hands-on improvement projects, breakthroughs routinely lead to quantum leaps in profitability.
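The single "Run Model" command can be imitated with any modern statistics library. As an illustrative sketch (assuming Python with SciPy; the measurement data below are invented, not taken from the book's tables), a one-way ANOVA returns the F ratio and its p-value in one call:

```python
# One-way ANOVA: the F ratio is the signal-to-noise comparison the text
# describes. The two samples below are made-up illustration data.
from scipy.stats import f_oneway

old_way = [12.1, 11.8, 12.4, 12.0, 11.9]
new_way = [12.9, 13.1, 12.7, 13.0, 13.2]

result = f_oneway(old_way, new_way)
print(result.statistic, result.pvalue)  # large F, small p: reject the null
```

Minitab and JMP, the packages named in the text, expose the same analysis through their own menus and commands.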

Authentic 6σ executives eschew the use of spreadsheet bar graphs and pie charts. Correct, rule driven analyses of financial and productivity data are evident in Six Sigma executive presentations. Acronyms and algebraic symbols are Six Sigma grammar. We identify these hieroglyphics as a courtesy orientation to newcomers. Profit Signals quantify what matters most.

Figure 1 A complete analysis is composed of six vectors.

Six Sigma (6σ) Basics

Here is the bullet list of Six Sigma basics.

1. Top-level executives personally lead the Six Sigma initiative in highly visible ways. Executive compensation and promotion are tied to the use of data-driven, evidence-based decisions. If an executive champion does not meet the challenge of these responsibilities, the Six Sigma initiative will fail to produce promised results. The litmus test of leadership is the replication of high dollar value breakthrough projects.

2. Education and skill training in the recognized body of knowledge (BOK) permeate Six Sigma organizations. Computing literacy, which means decision makers know how to use a vector analysis applied to a data matrix, is an expected competency for every leader.2

3. New ways of getting work done, with fewer resources, and in a fraction of the time required by previous methods, take precedence over incremental process improvements. Exponential rates of improvement are an expected outcome.

4. Measurements and Six Sigma metrics are tied to short-term and long-term financial performance.

Executive Six Sigma leaders allocate significant personal time and resources for 6σ projects. In addition to their own investments, they assign the company’s most capable people full-time to lead Six Sigma breakthrough projects. The Executive’s job is to remove bureaucratic roadblocks to improvement so that managers who have an aptitude for implementing productive changes can succeed.

The corporate Six Sigma job description hierarchy resembles titles earned in a martial arts dojo. Full-time Six Sigma professionals, called Black Belts, are expected to be able to “kick the heck out of” any variation that leads to waste or rework.3 In addition to a Karate/Tai Kwan Do/Kung Fu/Judo level of intellectual aggressiveness, Black Belts must demonstrate leadership and good interpersonal skills. They must be masters of evidence-based decision principles. Ideally, sensei executive champions coach and mentor 9th degree Master Black Belts, who in turn coach, mentor and lead Black Belts. Black Belts then coach and supervise Green Belts and Yellow Belts. Education and training permeate the organization. Eventually every employee actively contributes to the production of breakthrough project results: cold cash to the bottom line.

The Six Sigma Profit Strategy

Six Sigma improves profits by aiming at perfect products, services, and processes. In a 6σ culture, everyone is expected to enthusiastically argue in favor of perfection. A passionate work ethic attitude carries weight in a Six Sigma culture. Protests over the possibility of a “diminishing rate of return” indicate an individual does not understand 6σ fundamentals.

The lower case Greek letter, σ, is pronounced ‘sigma.’ In the professional world, σ is the symbol for the population standard deviation. The sample standard deviation, along with the five other elements in a complete vector analysis, comes from raw data. It quantifies the amount of random or chance variation that occurs around the average in any, and every, given set of data. To understand and embrace the universal Generalization of Chance Variation is to enter the world of Six Sigma.

Try the following experiment to demonstrate this physical law for yourself. First, find a friend you admire. Choose someone with whom you can discuss controversial information. Now, each of you needs to print the letter “a” 10 times on a piece of paper in the exact same way with no variation.4 Go on. Try it. This exercise is a trick. The task is completely impossible. Differences in writing tools, variations in ink, paper texture, handedness, fatigue, font, attention span, concentration, your interpretation of our instructions, and an infinite number of other variables all contribute to natural variation.

Natural variation is present everywhere and always. It is ubiquitous. It is a law of our universe, as powerful as gravity. Every good product and every service suffers from the inconsistencies caused by variation. I. Bernard Cohen, the eminent historian, considers knowledge of chance and/or statistical variation to be the distinguishing characteristic of our generation’s Scientific Revolution. “If I had to choose a single intellectual characteristic that would apply to the contribution of Maxwell [though not directly to his revolutionary field theory], Einstein [but not the revolution of relativity], quantum mechanics and also genetics, that feature would be probability.”5 We agree. This Six Sigma Revolution in business and science is defined by evidence that is based on probability rather than determinism.6 Like it or not, probability overthrows old doctrine. There is no polite way to summarize the impact variation has on an individual’s world view.
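The average and sample standard deviation that anchor this discussion take only a few lines to compute. A sketch using Python's standard library; the cost figures are invented illustration data:

```python
# Mean and sample standard deviation quantify the center of a data set
# and the chance variation around it. Illustration data, not from the book.
import statistics

costs = [104.0, 98.5, 101.2, 99.8, 102.6, 100.9]

average = statistics.mean(costs)
sigma = statistics.stdev(costs)  # sample standard deviation
print(round(average, 2), round(sigma, 2))
```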
Probability, dressed up in the Six Sigma costume, is replacing old ways of knowing—revelation, intuition, and reason—with the disciplined analysis of experimental observations. Six Sigma unifies the scientific method and business. Evidence-based decisions and the power in a vector analysis are the router connections between the two disciplines. In answer to the meta-questions, “Does this Six Sigma stuff really work?” and, “Can you prove it by replicating your results?” the answer is unequivocally, “You bet.”

With any and every set of raw data we can construct a tetrahedron, the cornerstone of statistical evidence. When a standard deviation is combined with an average, we can make valuable predictions based on a family of probability curves and surfaces (Figure 2). When one knows the average and standard deviation (σ) of a process, one can improve that process to near perfect, 6σ, performance. Perfect quality first time every time is valuable. This value can be measured with money.

Figure 2 Data matrix software automatically transforms the cornerstone of evidence into probability distributions.

Figure 3 illustrates old school 1980s corporate Quality Improvement (QI) aims. Way back then, ‘three-sigma’ quality was the target.7, 8 This means that the 6σ total process spread just fits between the lower and upper specification limits (LSL and USL). At best, this means that 99.7% of process outcomes satisfy customer requirements. This near 100% quality sounds better than it is. Recall the unacceptably wide variation in the prior chapter’s bar chart bamboozling comparison. At its best, a three-sigma 99.7% distribution promises ‘only’ 2,700 defective outcomes per million produced.

A three sigma process may actually produce as many as 67,000 mistakes or defects per million (DPM). This is because processes typically drift by about 1.5 standard deviations around their long term average.
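Both defect rates follow directly from the normal distribution. A sketch using only the Python standard library (`math.erfc` supplies the normal tail probability); the 1.5 sigma drift is the conventional Six Sigma assumption quoted above, and the final line anticipates the 3-4 DPM figure the text gives later for a drifted 6σ process:

```python
# Normal-tail arithmetic behind the defects-per-million (DPM) figures.
import math

def upper_tail(z):
    """P(Z > z) for a standard normal variable (standard library only)."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Centered three-sigma process: each spec limit sits 3 sigma away.
dpm_centered = 2 * upper_tail(3.0) * 1_000_000                   # about 2,700 DPM
# The same process after the conventional 1.5 sigma drift.
dpm_shifted = (upper_tail(1.5) + upper_tail(4.5)) * 1_000_000    # roughly 67,000 DPM
# A Six Sigma process with the same drift: the near limit is 4.5 sigma away.
dpm_six_sigma = (upper_tail(4.5) + upper_tail(7.5)) * 1_000_000  # a few DPM
print(round(dpm_centered), round(dpm_shifted), round(dpm_six_sigma, 1))
```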


Figure 3 Three-sigma quality means that the 6σ total process spread just fits between the lower and upper specification limits (LSL and USL). At best this means that 99.7% of process outcomes satisfy customer requirements.

To put these numbers into perspective, ‘three-sigma’ aviation safety would mean several airline crashes each week. In health care, it would mean 15,000 dropped newborn babies per year. Banks would lose thousands of checks daily. As it is, three sigma (3σ) quality costs businesses between 25-40% of their annual operating income in waste and rework.

Six Sigma breakthrough projects aim to reduce the standard deviation. High-leverage processes that affect business, manufacturing, or health care delivery are the prime targets. The one-part-per-billion (PPB) Six Sigma bell curve in Figure 4 covers only one half of the specification

Figure 4 A Six Sigma capable distribution covers only one half of the specification range.


range. This illustrates the dramatic financial benefit of reducing the standard deviation. Even when the process drifts, only 3-4 defective outcomes per million (DPM) can occur. In a σ = $1.00 example, a Six Sigma breakthrough would result in a standard deviation that equaled $0.50 or less. When this goal of perfection is achieved, costs related to waste, rework, inelegant designs, and needless complexity disappear. The proven rewards for achieving 6σ are, 1) excited customers and 2) improved profits.

Historically, each Six Sigma project generates a $100-250K benefit. Full-time corporate 6σ Experts, Black Belts who currently earn about $120K in salary and benefits, lead three to four projects per year that generate $1 million in hard dollar, bottom line business benefit. This 10:1 rate of return is so dependable it has become a tradition. Prior to the development of Six Sigma in the late 1980s, the only people earning their livings full time using these tools for breakthrough projects were consultants. We were the only ones willing to study out-of-date textbooks, use handheld calculators, rulers, graph paper, and DOS programs. Thank heavens those days are behind all of us now. Anyone and everyone can enjoy the benefits of vector analysis applied to a data matrix. Six Sigma style profits are now a matter of personal choice.

The Lucrative Project Results Map

Flow diagrams and process maps simplify work. They make hidden process dynamics visible. Seeing waste and complexity helps people eliminate both. Flow diagrams like Figure 5 can also be used to create processes that produce perfect results. To read the diagram, begin with the hard copy documentation symbol at the upper left hand corner. Follow the arrows through each of the four levels to the right hand page bottom. The acronym used to describe the classic 6σ process is DMAIC. DMAIC stands for the iterative 6σ project cycle of Define, Measure, Analyze, Improve, and Control.
Once a project is completed, the process described by this map begins again. This cycle never ends.


Figure 5 This flow chart has guided projects toward bottom line business results for years.

Define, Measure, Analyze, Improve, Control

The voice of the customer (VOC), customer satisfaction and profit goals come first and last in the Six Sigma DMAIC cycle of improvement. The series of five steps in the top row and the two final steps in the bottom row are top-level management and leadership responsibilities. The middle three levels are Black Belt project tasks. The map marks the boundary of each phase. Results interpretation, improvement and control require close collaboration between top-level leaders and Black Belts. This commitment is the key to perpetual breakthrough project success at the highest levels of the company. As 6σ breakthroughs help companies surpass quarterly and annual financial targets, long-term objectives are continuously upgraded to sustain momentum.

The process of interpreting statistical results, making an evidence-based decision, optimizing a system, and implementing improvements can and does flatten bureaucracy. Each of these steps takes time, so every 6σ project result needs to be substantial and financial. Six Sigma programs are seen as disruptive when a business values group think. In companies with a full commitment to evidence-based decisions, there is broad-based organizational involvement. Employees will openly question executives. Cost-accounting reports and risky capital investment Proformas will be challenged with physical models.

Occasionally organizations that value bureaucracy manage to “do Six Sigma” while they find ways to sustain paperwork, committees, and supervisory redundancy. Don’t laugh. Many do. These folks just happen to draw an interesting set of Six Sigma project boundaries. Senior management processes and decisions are off limits. Six Sigma window dressing is immediately apparent to any knowledgeable observer. The ones we have worked with and for are populated with delightful, friendly people. Still, we advise potential clients who are fond of their bureaucracies to stick with Old School Management methods: “Don’t go there.” Evidence-based decisions and Six Sigma will bring them nothing but trouble.

Six Sigma employs just about every effective management tool that has ever been developed. For example, the project management chart developed by Henry L. Gantt in 1917, called a Gantt chart, is still useful and very much in vogue.9 A PERT (Program Evaluation and Review Technique) chart, which provides an alternate Gantt chart view of a project, is also popular. Any project management tool you can think of that has proven to be useful is now called a Six Sigma Tool.

In actual practice Six Sigma focuses relentlessly on completing projects within 90-120 days. Experience shows that if a Six Sigma project improvement team fails to deliver bottom line business dollar value within this time frame, organizational commitment to 6σ instantly wanes. The Six Sigma field is littered with the corpses of failed Black Belt Projects. Make no mistake. The Institution of Old School Management thinking does not surrender until it is surrounded and expelled. Old school managers can and do successfully use neglect to sabotage Six Sigma. Though it is management’s responsibility to keep the improvement fires burning, project delays and passive criticism are favored benign neglect techniques. The successful disruption of projects generally returns the culture to less demanding performance standards.

We saw a most eloquent occurrence of this phenomenon in a CEO’s behavior. After a few months of Six Sigma hoopla, people noticed that he wasn’t using evidence unless it supported the foregone corporate agenda. Resistance to evidence-based decisions grew. Projects were not being completed on time. One day, he casually observed to the vice president in charge of implementing Six Sigma, “Six Sigma is ephemeral.” The VP looked the word up and discovered, to his dismay, that ephemeral means “dead in a day.” Nevertheless it was informative to watch a Black Belt compete, particularly an experienced Master Black Belt. They do what it takes to bring home the bacon. As a side note, it was interesting to see this Six Sigma initiative generate about $6 million in bottom line benefits by the year’s end. The dollar per dollar return on investment was only 5:1.

Therefore, we strongly recommend that once a company commits to evidence-based decisions, it stay focused on the money and deadlines. Since time is money and money is time, the selection process must be efficient and fast. New projects are usually given serious review at quarterly and annual intervals. The hopper is always open. Six Sigma team leaders put breakthrough improvement project ideas into this hopper.

Lucrative Project Selection

Selecting and prioritizing the most rewarding projects is a most important first step. Table 1 is a simple, virtually universal project evaluation spreadsheet that has emerged as a favorite around the United States. During project review meetings, each idea is ranked from low to high, 1-10, in five or more categories. These values are multiplied to create a priority rating. The suggested project with the highest total project priority number is first and so on. This project prioritization process promotes a consensus style agreement that has some quantitative structure to it. We have seen it improve interpersonal working relationships as it generates lists of breakthrough project targets.10

Table 1 You can program a spreadsheet to help you choose best projects.

Clear operational definitions are a fundamental part of project selection. The project and its related issues must be defined operationally. Operational definitions must be practical. If senior management, the project champion, or the Black Belts who are assigned to the projects do not share a common understanding of these definitions, problems arise.
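The multiply-the-rankings arithmetic behind Table 1 is easy to reproduce. A sketch in Python; the project names and rankings are invented for illustration:

```python
# Each candidate project is ranked 1-10 in several categories; the
# rankings are multiplied into one priority number, and the highest
# product goes first. All names and scores below are made up.
from math import prod

projects = {
    "Reduce scrap in line 3": [9, 8, 7, 6, 8],
    "Shorten invoice cycle":  [5, 6, 4, 7, 5],
    "Cut ER waiting time":    [8, 9, 9, 8, 7],
}

priorities = {name: prod(ranks) for name, ranks in projects.items()}
for name, score in sorted(priorities.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:6d}  {name}")
```

Because the ratings are multiplied rather than added, one low category score drags a project down sharply, which is part of what forces the consensus discussion the text describes.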

For example, the misinterpretation of a three-letter word has been known to derail projects. So, write down your definition of the word ‘pan.’ Good work. Your definition is correct. So are at least 20 others. Pan is a cooking container, a depression in the earth, a cavity in the lock of a flintlock, and the Greek god of the woods. You can pan a camera or pan for gold. In Spanish, pan means bread. Any one, or all of these answers, taken in the context of its definition, could be considered to be correct.

Here is another classic example that illustrates why clear operational definitions are crucially important to even a simple process like counting. Count the number of f’s in the following paragraph.

FOR CENTURIES IMPORTANT PROJECTS HAVE BEEN DEFERRED BY WEEKS OF INDECISION AND MONTHS OF STUDY AND YEARS OF FORMAL DEBATE.

How many did you count? Pause to write your answer here before moving on. _______

Depending on how you decided to define the letter “f,” there are seven possible correct answers. Believe it or not, zero is one correct answer. If you decided to count any F, there are 6. If you proof read phonetically, in other words you defined “f” by the sound of the letter, the F in each OF sounds like a “v.” So, if you defined an F by the way it sounds you could have counted 1, 2, 3, 4, 5, or 6. There are no lower case, italicized f’s.

For this very good reason, experienced Six Sigma Master Black Belts and Black Belts are very specific when they define what it is that they intend to count or measure. During the project selection phase, perfect Six Sigma performance expectations called Critical to Quality (CTQ) or Key Quality Characteristics (KQC) are defined in statistical terms. The definition must include an average and a standard deviation. Both come from a vector analysis applied to a data matrix. Without a statistical definition, there can be no objective evidence. Do more of what works. Since the Profit Signal is also automatically produced by this analysis process, it is considered to be part of a comprehensive operational

definition.

Once operational definitions are agreed to, an average and a desirable standard deviation for project outcomes are targeted. These figures are accompanied by the expected dollar value of benefits the company can look forward to harvesting. When projects have been identified and Key Quality Characteristics are defined, financial models are used to create credible bottom line profit signal estimates.

Financial Modeling and Simulation

Six Sigma budget models are dramatically different from, and superior to, spreadsheet arithmetic Proformas. Legitimate financial forecast models are created using vector analysis rules. A reliable forecast is as transparent as an authentic analysis. All data and all elements are revealed. Increasingly accurate predictions put the world of continuous spreadsheet revisions to shame.

Every manager who has actually participated in the old-school ritual called “spreadsheet scenarios” must candidly admit to making the numbers up. The old school cost-accounting variance analysis encourages confabulation by eliminating 5 vectors, or 83 percent, of all the information contained in raw data. By using only one vector, and masking the other five vectors, almost any scenario, or forecast, holds water. This is a covert impropriety if there ever was one. An abacus cannot beat a super-computer no matter how fast one’s fingers are. Six Sigma tools raise the standards of what does and does not constitute a credible Proforma.11 When correctly employed, these high analytic standards are used at all levels in the organization.

Dr. John M. Charnes exemplifies leadership in the field. As the Area Director for Finance, Economics, and Decision Science at the School of Business, University of Kansas, he used the Decisioneering product called Crystal Ball to create a 16 module self-guided study course that we think is excellent. Dr. Charnes is a frequent contributor.
Financial Engineering News is one of many trade publications that helps professionals get up to speed on the use of these tools. John M. School of Business. or 83 percent. an average and a desirable standard deviation for project outcomes are targeted.

If you look closely under the hood of reputable simulation applications, each has a data matrix and vector analysis for sparkplugs. This is why computerized simulation is a “Six Sigma Tool.” His open system flow diagram, with clouds representing thought processes, is shown in Figure 6.

Figure 6 One popular Six Sigma software program uses flow diagrams to graphically detail the iterative cycle used to create and improve financial forecasts.

Simulation is proving to be as beneficial to financial managers as it is to engineers, doctors, jet pilots, and student drivers. Working under old school constraints, engineers had to build expensive physical models to test their ideas. Surgeons had to test new techniques on live patients. Pilots had to practice first solo flights at 600 miles per hour. Parents had to take their teenager into traffic and hope for the best. With multi-dimensional computer simulations, new designs, surgeries, flying skills, and even freeway entrances can be tested “off-line” first to minimize risk. The benefits to simulation are objective and overwhelming.12

In a data matrix driven budget forecast, the historical data underlying each budget assumption are graphed in 2, 3 and more dimensions prior to including that assumption in the forecast model. Once assumptions are validated, multivariate models incorporating factor interactions, correlations, and entrepreneurial assumptions are created. The model can then be simulated thousands of times in seconds. The output is presented graphically.
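The "simulated thousands of times in seconds" step can be sketched without any commercial package. This is not Crystal Ball's actual API; it is a hand-rolled Monte Carlo illustration in Python, with invented distribution assumptions for each budget line:

```python
# Monte Carlo budget forecast: each assumption is a distribution, not a
# single spreadsheet number. All parameters below are invented.
import random
import statistics

random.seed(1)  # fixed seed so the run is repeatable

def one_scenario():
    revenue = random.gauss(1_000_000, 80_000)    # assumed mean and sigma
    materials = random.gauss(400_000, 30_000)
    labor = random.gauss(250_000, 15_000)
    return revenue - materials - labor

profits = [one_scenario() for _ in range(10_000)]
print(round(statistics.mean(profits)), round(statistics.stdev(profits)))
# The mean profit lands near 350,000; the spread quantifies forecast risk.
```

The point of the exercise is the second number: a spreadsheet scenario gives one profit figure, while the simulation reports an average and a standard deviation, the same two quantities a vector analysis produces.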

They let leaders meet and beat breakthrough project goals. Every day we thank the General Electric Senior Vice President who took time out of her day in 1997 for a cold call telephone interview. She explained how these programs push Six Sigma forward. Her counsel was, and remains, rock solid.

Beginning in the late 1980s, inventive software manufacturers began to develop programs that forced spreadsheets to behave like a data matrix. These macros are now mature modeling programs. With the finance simulation tool add-in, budget forecasts inherit the power of a vector analysis. Continuous feedback and discipline improve model forecasts over time. Vector models do an impressive job of helping decision makers visualize probable outcomes. The geometry that guided their design creates graphic results that look terrific. They are a joy to use.

Spreadsheets are wildly popular because they let anybody do absolutely anything with any number. With a spreadsheet, it all looks legitimate! Spreadsheets encourage analysts to believe in the great bamboozle. Beyond the covert elimination of 5 analysis vectors, spreadsheet arithmetic budget models and "what-if" scenarios fall short of evidence-based decision standards in significant ways. Though spreadsheet arithmetic sets the standards of evidence at a comfortably low level:

1. An individual number in a cell is accepted on face value. This number frequently misleads because it is not framed in a meaningful context. Without analysis context—an average, a standard deviation, and an analytic graph—people must guess at the number's meaning in relation to the other variables in a system. This error is serious.

2. Because the columns and rows in a spreadsheet look just like the columns and rows in a data matrix, many conclude equivalent answers are automatically produced with each one. Many now are convinced simple addition, subtraction, multiplication and division are appropriate tools for analyzing complex, multivariate systems. They produce illusion rather than insight.

3. Spreadsheet scenarios are usually created using One-Factor-at-a-Time (OFAT) methods. OFAT analysis and experimentation methods are no more reliable now than they were when Frederick Taylor used them in the 26 years leading up to 1911. Conclusions reached using this method are at odds with physical Laws of the Universe. Not only do they not yield an accurate answer; by over-simplifying problems, they waste time.

4. Spreadsheet scenarios create a false impression of precision. Since there are no rules, answers are notoriously unreliable and have no basis in the exterior world.

People are persistent when it comes to juggling numbers. Human nature is tireless in its allegiance to irrational beliefs. Martin Gardner's Fads and Fallacies observation rings as true today as it did when he wrote it in 1952: "How easy it is to work over an undigested mass of data and emerge with a pattern, which at first glance is so intricately put together that it is difficult to believe it is nothing more than the product of a man's brain… Consciously or unconsciously, their perceived dogmas twist and mold the objective facts into forms which support the dogmas."13

Spreadsheet numbers are dressed up in impressive looking data arrays. These images shout out, "Hokum!" Nevertheless they are routinely presented, accepted, and framed as "certain forward thinking statements" in a social gesture of courtesy that smacks of hubris. In corporate hierarchies these courtesies force otherwise intelligent, well-meaning people to forget what they know about mathematics.

Simulation programs give spreadsheets like Excel a new lease on life.14 This is a very good thing. Macros that follow the rules of evidence are bringing high standards to the world of accounting and finance. Once a 3D cube model is embedded in a spreadsheet, likelihoods and probabilities are presented automatically in attractive visual graphs. With simulation, managers can perform tens of thousands of multivariate scenarios in minutes. This is less time than it takes a skilled controller to complete a single One-Factor-At-a-Time (OFAT) budget forecast scenario. Analysts have a much better grasp on the range of possible budget outcomes.

Compare and Contrast Analysis

The classic budget forecast (Table 2) is usually created by estimating three outcomes: 1) best case, 2) worst case, and 3) most likely case.15 Note how the forecast in the bottom right hand cell catches the eye. NanoTech Widgets solve problems. They are multi-purpose. With a projected profit of $9.2 million, they are a sure fire new product. Predictably, personal opinion and a consensus are the only evidence required for making the decision to pursue the NanoTech Widget.16

When this spreadsheet was analyzed 1,000 times in under six seconds using the data matrix introduced in the Five-Minute PhD, a much different picture emerged. Figure 7 does not tell the manager what to do. However, it does point out that the most probable outcome is a $14.4 million loss rather than a $9.2 million gain. Simulation can increase one's level of confidence as business decisions are made in the face of uncertainty.

In addition, and for no extra charge, the simulation automatically produces a sensitivity chart that resembles a spreadsheet bar graph. Figure 5 is a sensitivity chart illustration. Once the simulation is complete, the analyst or manager can evaluate which of the variables has the greatest impact on the bottom line. A sensitivity analysis ensures that management focuses on the key variables that have the most impact, rather than being distracted by variables they think may be most important. Vector analysis sensitivity charts rank Profit Signals according to the strength of the evidence for each factor; factors are ranked in importance according to the relative strength of statistical evidence (Figure 8). Sensitivity charts expose counter-intuitive patterns that are masked by spreadsheet arithmetic.
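A sensitivity ranking of this kind can be approximated directly from the raw simulation draws by correlating each input with the simulated bottom line. The sketch below is an illustration only: the three inputs, their distributions, and the fixed market size are hypothetical assumptions, not the NanoTech Widget model. It uses the Spearman rank correlation, one common choice for sensitivity charts.

```python
import random

def ranks(xs):
    # Rank-transform values: 1 = smallest.  Random floats will not tie.
    order = sorted(range(len(xs)), key=xs.__getitem__)
    r = [0] * len(xs)
    for pos, i in enumerate(order):
        r[i] = pos + 1
    return r

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def spearman(xs, ys):
    # Spearman rank correlation = Pearson correlation of the ranks.
    return pearson(ranks(xs), ranks(ys))

random.seed(1)
N = 5_000
draws = {
    "market_share": [random.uniform(0.05, 0.25) for _ in range(N)],
    "unit_price": [random.gauss(10.0, 0.3) for _ in range(N)],
    "unit_cost": [random.gauss(6.0, 0.2) for _ in range(N)],
}
MARKET_SIZE = 1_000_000  # fixed assumption, in units

profit = [
    MARKET_SIZE * s * (p - c)
    for s, p, c in zip(draws["market_share"], draws["unit_price"], draws["unit_cost"])
]

# Rank the inputs by the strength of their association with the bottom line.
sensitivity = {name: spearman(vals, profit) for name, vals in draws.items()}
ranking = sorted(sensitivity, key=lambda k: abs(sensitivity[k]), reverse=True)
print(ranking)
print(sensitivity)
```

With these invented distributions the market share assumption dominates the bottom line, which is the kind of non-obvious ranking a sensitivity chart is designed to surface.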

Table 2 The classic old school budget forecast for new product development presents assumption numbers without the benefit of either context or evidence. The average, standard deviation, p-value, evidence strength, and analytic graphs are ignored. Forecasting spreadsheet analysts are simply expected to correctly guess all values.

Figure 7 Based on all the actual data at hand, there is a 77.9% chance of breaking even with the NanoTech Widget. The most probable outcome is the $14.4 million dollar loss highlighted at the left side of the forecast. There is only about a 50/50 chance of making the projected $9.2 MM.

Simulations and legitimate financial forecasts are standards in Six Sigma breakthrough projects. We have seen simulations effectively tackle budgets with up to 77 variables and factor interactions. The level of thoughtfulness this tool creates is well worth the time investment required.

Figure 8 Success in launching the NanoTech Widget product depends on the company's ability to penetrate the market.

Process Maps

Maps, from Babylonian clay tablet cartography to downloadable Internet driving directions, are universal communication tools. Since maps have proven their value, they too are a "Six Sigma Tool." It is no accident that process maps are the first-choice tool taken down from the shelf after a lucrative new project has been selected.

A good process map is as multidimensional as a set of nested Chinese Boxes or Russian dolls. The outermost Russian doll is called a Matreshka, or grandmother. Succeeding generations are contained within her.17 Miniaturized generations are refined replicas. Each three-dimensional replica must be produced using fewer resources and a higher degree of precision. Though nested boxes and Russian dolls are not official Six Sigma tools, these analogies encourage people to look more deeply than a surface appearance. So it is with Six Sigma process maps.

In the same way that the Space Shuttle Radar Topography Mission used vectors, geometry, and computing power to map 80 percent of the earth's landmass in only 10 days' time, Black Belts are expected to map the nested dimensions of a work process in about a week. These maps are impressive. The 'Six Sigma Matreshka' in Figure 9 is a Suppliers, Inputs, Processes, Outputs, and Customers map, or SIPOC for short.

First blush drawings can span pages. Over time, these are simplified and distilled into diagrams that illustrate and endorse only essential elements that pull the system forward efficiently. Drawing these maps from the end to the beginning is the best way to produce a meaningful SIPOC map showing all the relationships. It is assumed that this system has feedback loops throughout. These loops are not illustrated here in order to present a clean, simple picture. Practice makes perfect. This is one of the skills that is worth the practice time investment.

Figure 9 Black Belts use personal interviews, first hand observations, and measurements to complete this map.

Waste and rework plague every production and service delivery process. Invariably, every Six Sigma map uncovers the hidden factory (Figure 10). This Six Sigma drawing describes the "hidden-factory." Its appearance has a Charlie Brown and Lucy Van Pelt quality to it. Just as the start of every football season is marked by Lucy tricking Charlie into trusting her for the inevitable betrayal, the hidden factory always makes an appearance. Hidden factories always happen where: 1) a loop reverses the forward motion of the product, 2) a delay, bottleneck or constraint slows process flow, or 3) a barrier stops production altogether.

Figure 10 The hidden factory of rework in this map includes Processes 4-6 and the related delay.

Hidden factory maps are often posted in conspicuous places, from the boardroom to the individual work space on the factory floor. The most sophisticated ones, called lean process maps, are drawn in an old fashioned, low-tech way using paper and pencils. They track information and material flows from start to finish through an organization. Lean maps have a lexicon and icon system that is worth studying.18 Lean is a separate business tool that deserves, and has, its own literature. Suffice it to say lean maps document the entire value stream. Management responsibilities are visible and a host of snapshot measurements are recorded. Mapping is a documentation discipline that is rewarding and informative.

Lean metrics make sense. They include uptime, working time minus breaks, takt time, cycle time (C/T), changeover time (C/O), value added time (VA), and production lead time (PLT). In addition, lean maps record production batch sizes for every product interval (EPE), a plan for every part (PFEP), First-In-First-Out (FIFO), the numbers of operators, the number of product or service variations, inventory turns, and the scrap rate. Times are recorded in days, hours, minutes, and seconds. Like a vector analysis, each time has an exceptionally specific operational definition.19 The 'time-is-money-money-is-time' theme dominates lean thinking. With the lean Six Sigma strategy in place, a second can be, and often is, literally worth thousands of dollars. For example, in one San Jose Internet router factory a 2 foot by 2 foot by 2 foot pile of scrapped motherboards was time-valued at more than $6 million.

"Hidden Factory" costs, or the costs of waste and rework, are called the Costs of Poor Quality (COPQ). Since the 1950s, breakthrough projects have focused on eliminating these expenses. Lean measurements and flow mapping earned their way into Six Sigma the old fashioned way. They work. Their record of extremely profitable achievements began in the 1950s Toyota production system and continues to this day. This is why lean maps are a "Six Sigma Tool." Rightfully so. Those who are familiar with lean tools do not argue against them. To do so would be as foolish as arguing against the speed of light or the existence of gravity.

These maps help people identify process factors, known as the X's, that may be driving the system toward profits or losses. Measured outcomes, such as profits and losses, the dollar value of a process outcome, or the impact of variation on measurement, are called Y's. A thought map, Figure 11, shows how these factors become a series of hypotheses in a data matrix. Once the matrix is filled with measurements, a vector analysis will point out strong and weak Profit Signals with objective standards of evidence.
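One of those exceptionally specific operational definitions can be made concrete with simple arithmetic. The sketch below computes takt time, the standard lean pacing metric: net available working time divided by customer demand. The shift length, break schedule, and demand figure are hypothetical numbers chosen for illustration.

```python
# Takt time = net available working time / customer demand.
shift_minutes = 8 * 60          # one 8-hour shift
break_minutes = 30 + 2 * 10     # lunch plus two ten-minute breaks
net_available = shift_minutes - break_minutes  # 430 minutes of working time
daily_demand = 215              # units the customer requires per day

takt_minutes = net_available / daily_demand
print(f"takt time: {takt_minutes} minutes per unit")
```

Here the process must complete a unit every 2.0 minutes to keep pace with demand; a cycle time longer than takt time is exactly the kind of constraint a lean map makes visible.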

Figure 11 Maps help improvement teams identify variables that will be subjected to a 3D vector analysis.

The Costs of Poor Quality

The origins of the "Costs of Poor Quality" idea can be traced to Walter Shewhart's invention of the quality control chart on May 16, 1924.20 The quality control chart is yet another way of graphically viewing a vector analysis. Shewhart was a physicist. He was also the accomplished statistician, friend, and colleague of Ronald Fisher who volunteered to care for Fisher's six children during World War II.21 A German U-boat's torpedo sank this plan in 1940.

Armand V. Feigenbaum is credited with developing the first dollar based quality reporting system while working at General Electric in the early 1950s.22 It is worth noting that this development ultimately can be linked to Jack Welch's 1990's Six Sigma initiative.23 The dollar figures Dr. Shewhart symbolically placed on the corners of Fisher's work 80 years ago are today's Six Sigma costs of quality.

Why are the costs of poor quality so important to Six Sigma breakthrough projects? Simple.

Revenues are taxed. One dollar in newly earned revenue can produce as little as one penny in new earnings. One dollar saved through the elimination of waste and rework drops to the bottom line as one dollar. Once a Black Belt gets the hang of the breakthrough project system, saving major dollars is like shooting fish in a barrel.

Each of four poor-quality cost categories can be leveraged to persuade senior management to embrace evidence-based decisions. Since Feigenbaum first created this classification system, prevention investments have been expected to produce, and have produced, a 10:1 return. For Six Sigma these costs include training, education, planning, analytic software, and information systems. Appraisal investments include quality audits, inspections, testing, maintenance, vendor certification using Six Sigma quality metric standards, and quality assurance system costs. To a certain extent, prevention and appraisal investments, often referred to as costs, are relatively static.

Information systems (IS) designed using principles of evidence-based decisions are far less expensive than those that are not. The vast majority of IS systems are modeled after spreadsheets. Transforming this system, or transferring the information in it to a data matrix, entails rework costs. Once the investment is negotiated, the payback is spectacular. If your company is looking for a place to begin Six Sigma, we encourage you to create an IS strategy that is based on sound geometric principles. The easiest way to learn if your system meets these standards is to ask your IS department to show you their data matrices and cube experiment arrays. This question invariably raises eyebrows.

Figure 12 Textbook example of a Cost of Poor Quality (COPQ) flow chart used by a Black Belt engineer, Scott Erickson.

Internal failure costs are hidden factory expenses that remain invisible to customers. Internal failure costs include all scrap and rework. Retests, engineering changes, excess inventory costs, and the related equipment required to rework products must all be tallied up. Failure Mode Effects Analysis (FMEA), Corrective And Preventive Actions (CAPA) and productivity losses are recorded here.

Finally, external failure costs are problems that land in the customer's lap. External failures are mistakes and errors that are highly visible to the customer. Liability suits, complaint handling, returns, warranty costs, and marketing and sales errors are all external failure costs. In our increasingly litigious society it is almost impossible to overstate the costs of failure. Shewhart wrote humorously about the reality of 1939 quality costs: "I am reminded of the old saying: when a doctor makes a mistake he buries it; when a judge makes a mistake, it becomes law. I would add in the same vein: when a scientist makes a mistake in the use of statistical theory, it becomes part of 'scientific law'; but when an industrial statistician makes a mistake, woe unto him for he is sure to be found out and get into trouble."24

The best way to protect any business from external failure costs is to produce a perfect quality product every time a product is produced. Delivering perfect quality services 100 percent of the time is a powerful business strategy. Only processes that are capable of producing perfection do produce this level of quality. Perfect processes can and do produce virtually perfect outcomes. A process capability index, known as Cpk, is the analytic measure used in graphic presentations of evidence documenting perfect quality.

Process Capability

We will use dice rolls for our example. Each of us has personally rolled dice more than 5,000 times. (Chapter 7 will extend this experiment to include four dice.) Feel free to roll your set of dice until you get 5,000 measurements. Or, with the click of a mouse button, you can accurately simulate the outcome of 5,000 rolls in a minute using software. We chose the simulation method for this example. Comparing both methods will give you a good feel for the value of Six Sigma vector analysis software.

Software graphs our data and tells us how capable it is. The Cpk value is calculated by taking that old favorite, σ, and dividing it into the spread of the data. A Six Sigma process yields a Cpk of 2 or more. For this example, you can see we set our Lower Specification Limit (LSL) for perfection at 2 and our Upper Specification Limit (USL) of perfection at 12. Note how tightly the distribution curve is centered on the target of 7. If and when the tails of our curve fall above and/or below our perfection specifications, in real life these portions would be scrap and rework. Figure 13's process capability curve tells us our process is not capable of producing perfection.

Statistical software does not know that the outcome of throwing a pair of dice is constrained to the range 2 to 12. Therefore, it calculates and estimates statistical limits for a distribution as if it were not constrained. In this way our teaching analogy is flawed. Skeptics complain, "See, Statistics lie! They can't even handle dice rolling." These complaints are ludicrous. Ignore them. The point in this exercise is a principle.

Let's game this system and improve our Cpk by setting our LSL and USL perfection expectations at –10 and 30. Figure 14 shows that our process is now a smoking Six Sigma process fully capable of producing perfect quality outcomes 99.99999 percent of the time! We're in the money. And, by now you get the point. Six Sigma companies earn "high" Cpk values not by lowering their standards, but by raising them relentlessly, geometrically, and exponentially.

Figure 13 This process is not capable of perfection. Its Cpk value is only 0.640.

Figure 14 A Six Sigma process will produce perfection every time.

To achieve these levels of perfection, they use a vector analysis applied to a data matrix. The only way perfection can be pursued and achieved is by using quantitative measurements and analysis. The only set of tools that makes this rate of improvement possible is the scientific method and the geometry of a vector analysis. This is why there is such a bandwagon rolling with Six Sigma Breakthrough Projects and 6σ tools. This is why Six Sigma is not a fad.

Endnotes

Box, George E. P., Hunter, William G., and Hunter, J. Stuart. Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building. John Wiley & Sons, Inc., New York, 1978.

1 The body of knowledge that is widely regarded as the most comprehensive is posted by the American Society for Quality: http://www.asq.org/cert/types/sixsigma/bok.html

2 Mikel Harry, a popular leader in the Six Sigma field, reported this history on a video tape recorded in 1995.

3 Shewhart, Walter. Economic Control of Quality of Manufactured Product. D. Van Nostrand Company, Inc., Brooklyn, New York, 1931. Page 5.

4 Cohen, I. Bernard. Revolution in Science. Belknap Press of Harvard University Press, Cambridge, 1985. Page 96.

5 Cohen, I. Bernard. Revolution in Science. Belknap Press of Harvard University Press, Cambridge, 1985. Page 96.

6 Our bell curve illustrations were inspired by a drawing originally produced by Control Engineering Online.

7 Deming, W. Edwards. Out of the Crisis. Center for Advanced Engineering Study, Massachusetts Institute of Technology, Cambridge, 1982. Pages 170, 201.

8 As a sidebar note, it is interesting to know that Gantt patented a number of devices in collaboration with Frederick Taylor when they worked together at the Bethlehem Steel Mill on Taylor's Scientific Management theory.

9 Inspiration for this particular grid came from Moresteam.com: http://moresteam.com/ Their on-line Six Sigma Black Belt course is interesting and informative.

10 http://www.fenews.com/

11 http://www.processmodel.com/

12 This spreadsheet is used with permission along with the flow diagram for financial models.

13 Gardner, Martin. Fads and Fallacies in the Name of Science. Dover Publications, Inc., New York, 1952. Page 184.

14 http://www.decisioneering.com

15 http://www.decisioneering.com The numbers and layout of this budget come from Decisioneering's tutorial example.

16 An interesting history of this symbolism can be found at http://www.nestingdolls4u.com/history/history.htm

17 http://lean.org

18 Womack, James P., Jones, Daniel T., and Roos, Daniel. The Machine that Changed the World. Rawson Associates, Scribner, Simon and Schuster, New York, 1990.

19 Harrington, H. James. Poor Quality Cost. Marcel Dekker, Inc., New York, 1987.

20 Shewhart, Walter A. Economic Control of Quality of Manufactured Product. Van Nostrand and Company, Inc., New York, 1931.

21 Box, Joan Fisher. R. A. Fisher, the Life of a Scientist. John Wiley and Sons, New York, 1978. Page 377.

22 Harrington, H. James. Poor Quality Cost. Marcel Dekker, Inc., New York, 1987. Page 40.

23 ClearVision, Inc.

24 Shewhart, Walter A. Statistical Method from the Viewpoint of Quality Control. Dover Publications, Inc., New York, 1986. Pages v, xiv.

Chapter 4
Case Studies

Case studies needed to meet four criteria. First, each story had to be true, though names, places, and data were altered to protect privacy. It had to be entertaining. Each example also needed to graphically explain how evidence-based decisions produced crowd pleasing financial returns. Finally, the story needed to be a fair, representative sampling of what we each have repeatedly seen over the past 20 years of our professional life. We decided to tell them the way clients tell them.

Recounting a Six Sigma project victory is like explaining a magic trick. Once a wizard's secret is revealed, someone in the audience thinks, "Shoot! I could have done that." But no one, including the magician, can do it without knowing how.

Case studies are essential to understanding. Occasionally, the stories in this chapter trouble some managers. They hit too close to home. One senior executive reviewer echoed Daniel Sloan's own 1986 Vice President of Marketing's sentiments: "The stories in this chapter upset me. Maybe I just took them personally. It is difficult for me to keep reminding myself that what is past is past. As a senior executive, I have to keep telling myself that evidence-based decisions can and will prevent me from repeating history." Though we tried, we were unable to completely resolve this perplexing, vexing journalism quandary. We bit this bullet and chose to include them.

Magic tricks are illusion. Evidence-based decisions put real dollars in real banks. Evidence is a funny thing. Many of us are interested in evidence only when it confirms an existing belief or policy. This human tendency creates a resistance to transparent reporting systems in business and government. Another human tendency is to equate evidence with authority. This position was summarized in the March 1998 issue of Discovery Magazine: "Anybody who claims to have objective knowledge about anything is trying to control and dominate the rest of us…There are no objective facts. All supposed 'facts' are contaminated with theories, and all theories are infested with moral and political doctrines… Therefore, when some guy in a lab coat tells you that such and such is an objective fact,…he must have a political agenda up his starched white sleeve."1

This "know-nothing" doctrine stems in part from inadequate science and mathematics education. It is contradicted by the documented successes of the evidence-based decisions that power Six Sigma breakthroughs. Vector analysis is based on immutable Laws of the Universe. It is transparent. It uncompromisingly tells the truth. The cornerstone of evidence, a tetrahedron, symbolizes "solid evidence." All aspects are revealed. Given good data in a data matrix, vector analysis makes it virtually impossible to misrepresent the information in that data. Transparency, full disclosure and international standards for data analysis are the reasons Six Sigma works. They are also the characteristics that some find most disturbing about Six Sigma.

No analysis method can deliver us from the unethical corruption of reported data. It is also true that data can be suppressed, "massaged" or just plain falsified. Disraeli's comment "Lies, damn lies and Statistics" was a reference to this problem. Confidentiality is necessary in business and government. Still, too often, confidentiality is used to justify secrecy. Then both are tarred with a brush of cynicism.

Naturally, people tend to construct stories that favor their point of view. It is no surprise that spreadsheets have sensational appeal. Spreadsheets snap tightly to the New Age mantra, "Tell your own truth." New Age know-nothings can structure data any way they want, and they can analyze it any way they want. Spreadsheets are the engines for the cost-accounting variance analysis. These methods are inherently one-dimensional. Each uses only one of the six vectors in the cornerstone of evidence. None of them recognize the essential Profit Signal and Noise vectors. Because break-even thinking and cost accounting variance analysis allow management to ignore five of the six reality vectors, it is easy to construct any story that is consistent with any one vector. In this sense, cost-accounting variance analysis suppresses five-sixths—83 percent—of the information needed for an evidence-based decision.

As Master Black Belt teachers, we use forthright honesty, transparency, the cornerstone of evidence, and the New Management Equation to dispel the mystery surrounding evidence-based decisions. Everyone wants to make more money in less time, with less work, and using fewer resources. Once people harvest Six Sigma profits by making better decisions, objections diminish. Doing more of what works is a doctrine to embrace. Knowledge and reliable information start the Six Sigma DMAIC ball rolling. It leaves the trial-and-error methods of old-school management in the dust.

Customer Service – Governmental Agency

Political pressure was forcing a Washington State government department to improve the quality of their services or face the loss of $500,000.00 in funding as a penalty. The department's Executive Director gave employees the opportunity to choose a consultant to help them in their efforts to maintain current funding levels.

A Five-Minute PhD demonstration and evidence-based decision tools attracted their attention. We promised to help them present their evidence.

Define: Jobs, including management jobs, were on the line. One-half million dollars in legislative funding was at stake. Poor customer satisfaction had put this department on the legislative target hit list. Negative regional news coverage over departmental problems made state citizens angry. A specific criticism concerning this department's performance had to do with the way it answered its telephones. Several full time clerical staff answered phones that literally rang off the hook. The agency's executive director knew calls went unanswered. Armed with her own good judgment, she had instituted a department policy by edict: "All phones will be answered by the third ring." The ringing phones were right outside her office, and she monitored her policy. Answering machines are prohibited because they symbolize poor quality service. To begin the project, flow diagrams and a Pareto analysis exposed breakthrough improvement opportunities.

Measure: The team of secretaries who answered the phones claimed they had a good solution to the problem. "We can't get anyone to listen to us. We just do what we are told." There were a number of suspected causes, or hypotheses, for the unanswered phone flash point. These included:

• Hypothesis 1 (H1): The day of the week makes a difference. Some days are busier than others.
• Hypothesis 2 (H2): The time of day makes the difference. Some times are busier than others.
• Hypothesis 3 (H3): The telephone line made the difference. One line was busier than the other one.

We suggested that they use a check sheet to collect data. Using a paper and pencil, the team of secretaries constructed a check sheet to record matches with the cube experiment data matrix (Table 1).

Table 1: The cube experiment data matrix guided the collection of data recorded by hand on a check sheet.

Analyze: The data matrix revealed a distinct profit signal. Figure 1 presents the data in a cube plot.

Figure 1: All of the high numbers fell on the back plane of the cube.

The numbers of calls on line 2, the back face of the cube, were an order of magnitude larger than the number of calls on line 1, the front face of the cube. Line two was sending a clear profit signal. Hypothesis 3 was the "big hitter." No other variable had an effect. The main effect was so obvious everyone could see it immediately just by looking at the matrix.

It turned out that line two had been listed incorrectly in telephone directories across the entire state. This proofreading error was embarrassing. It would have been expensive to fix. No one had the courage to bring it to the attention of the agency's executive director.
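The "line" main effect the team saw at a glance can be computed directly from the eight corner counts. The counts below are invented for illustration, patterned on the order-of-magnitude gap the case describes:

```python
# Hypothetical unanswered-call counts for the 8 corners of the cube,
# keyed by (day, time, line) with levels coded -1/+1. The gap between
# line 1 (-1) and line 2 (+1) mimics the case, not the real data.
counts = {
    (-1, -1, -1): 3,  (+1, -1, -1): 4,
    (-1, +1, -1): 2,  (+1, +1, -1): 5,
    (-1, -1, +1): 38, (+1, -1, +1): 41,
    (-1, +1, +1): 35, (+1, +1, +1): 44,
}

def main_effect(axis):
    """Average response at the high level minus average at the low level."""
    hi = [y for k, y in counts.items() if k[axis] == +1]
    lo = [y for k, y in counts.items() if k[axis] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {name: main_effect(i) for i, name in enumerate(["day", "time", "line"])}
print(effects)  # the "line" effect dwarfs the other two
```

With data this lopsided, the back face of the cube tells the whole story before any formal analysis is run.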

The executive director's edict compounded the fear factor. The workforce decided it was much easier to keep their heads down. They became telephone operators and gave dialing assistance to callers.

Improve: One hour after the presentation of our evidence, rather than addressing the core issue, a telephone answering machine was purchased and installed. The answering message announced the Yellow Pages error and then gave callers the correct number. The executive director gave this improvement her blessing with a belly laugh. Six secretaries and other workers could now focus their attention on real work.

Control: Telephone listing corrections were made the following year. A breakthrough in the proofreading process ensured 100 percent, Six Sigma, accuracy. Eventually one full-time position was eliminated through attrition for a bottom-line savings of more than $25,000 per year in cash flow.

The total time to collect data for the data matrix was five days. The analysis and presentation took one hour. This breakthrough played a role in persuading legislators to sustain funding at existing levels. This, combined with avoiding the loss of funding, brought the total value of the project to $525K. We took their evidence forward with a firm conviction that, in at least this case, the messenger would not be shot so early in a consulting engagement.

Days in Accounts Receivable

A service company needed to reduce its number of days in accounts receivable, or AR days.

Define: For more than a year debate had raged over what could be done to reduce AR days. The number of days in AR ranged from 35 to 110 days. A breakthrough improvement project could yield as much as $35K per month, or $420,000 per year. Suspected causes for this

problem varied. These suspicions, or straw-man hypotheses, included the following:

• Hypothesis 1 (H1): Management is the solution. Good managers have short AR days. Bad managers have lots of days in AR.
• Hypothesis 2 (H2): Sales calls are the answer. The number of visits made by a salesman to the customer is key. The fewer the visits, the smaller the AR days. The more visits, the larger the number of AR days.
• Hypothesis 3 (H3): The customer is the main reason for long or short AR days. Good customers pay fast. Poor customers pay slowly.
• Hypothesis 4 (H4): The longevity of our customer relationship makes the biggest difference. Long-term customers pay more slowly because they know our business depends on them.
• Hypothesis 5 (H5): The number of services provided determines the number of AR days. More services create complexity. Billing complexity slows payment.

The Chief Financial Officer of this company was committed to keeping productive hours in line and on budget. Workers in his department were required to do their jobs as well as to work on breakthrough projects. No overtime would be paid for improvement tasks. A regular work schedule would be continued. Moreover, in order to keep operating costs low, no statistical software would be purchased. "Spreadsheets work fine."

Measure: Significant AR data had been collected. These records were stored in file cabinets. Each customer, and there were hundreds of them, had its own manila folder. It was with a great deal of pride that the accounting team showed that bills were filed in near-perfect chronological order. We interviewed every employee and constructed process flow diagrams. We identified five important variables that might affect AR days.

The CFO had vetoed a recent budget request for a PC workstation and relational database software, so automatic queries and data mining were out of the question. Going to Plan B, we used our own statistical software to create an optimal data matrix for five independent variables at two levels each (Figure 2). This took all of five minutes. Creating a data matrix is one thing. Collecting data that fits the profile of each run is another. It is virtually impossible with a spreadsheet.

A billing clerk and a billing manager volunteered to come in over the weekend and pull records. These two front-line leaders wanted to find out what combination of variables actually made a difference. They knew that if they found an answer, it would be valuable. Their daily workload was so challenging, they simply didn't have time to array any more spreadsheet data than they already were doing for the CFO during the regular workday. They believed they were familiar enough with customer profiles to find bills that would match each of the 32 different "runs" in the matrix.

Figure 2: Statistical software automatically determines the best data matrix geometry for a vector analysis involving five independent variables.
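The 32-run layout is simply a full two-level factorial in the five hypothesized factors. A sketch, with hypothetical level names, of how such a matrix can be generated (the authors used their own statistical software for this step):

```python
from itertools import product

# Five two-level factors, one per hypothesis (H1-H5).
# Level names are illustrative stand-ins for the company's categories.
factors = [
    ("manager", ["good", "bad"]),        # H1
    ("sales_visits", ["few", "many"]),   # H2
    ("customer", ["A", "B"]),            # H3
    ("relationship", ["new", "old"]),    # H4
    ("services", ["few", "many"]),       # H5
]

# Full factorial: 2**5 = 32 runs. Each run is a customer profile the
# clerks tried to match with a real bill pulled from the file cabinets.
names = [name for name, _ in factors]
runs = [dict(zip(names, combo))
        for combo in product(*(levels for _, levels in factors))]
print(len(runs))  # 32
```

Each run then gets its measured response, the number of AR days, recorded in the right-hand column.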

Figure 2 shows the first 28 rows of the data matrix with the number of AR days visible in the far right-hand response measure column. The every-other-row pattern of a short number of days in AR followed by a long number of AR days was evident immediately to the accounting department workers on a Sunday morning. The key difference between the two customers was widely known. Customer A was billed electronically. Customer B was billed manually. New customers were able to bill electronically. Old customers were not.

Analyze: Three strong profit signals emerged from the vector analysis we applied to their data matrix. We could say with a better than 99.999 percent level of confidence that the customer was a main effect. The computerized vector analysis also showed, with a 99% level of confidence, that the length of the customer relationship was another active factor that influenced the number of days in AR. The two factors, Customer and Relationship, and their interactive effect, were statistically significant at this confidence level.

The main effects were controversial. The company's Chief Financial Officer and Chief Executive Officer disliked computers. They still do. The CFO openly opposed the use of statistics. The CEO had excused the finance department from participation in breakthrough projects "until the data matrix and vector analysis tools proved to be useful." A year earlier, the Chief Financial Officer had refused to approve the purchase of a $15,000 PC workstation
Figure 3: P-values less than 0.05 imply a 95 percent level of confidence or more in the results. The p-values appear under the heading "Prob > F".

Anxiety filled the air.
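The significance figures above came from statistical software (the "Prob > F" column of an analysis-of-variance report). As a rough, software-free stand-in, a permutation test can estimate how surprising the gap between the two customers' mean AR days would be under chance alone. The AR-day samples below are invented for illustration:

```python
import random

# Illustrative AR-day samples (not the company's data):
customer_a = [35, 38, 41, 36, 40, 37, 39, 42]     # billed electronically
customer_b = [95, 102, 88, 110, 97, 105, 92, 99]  # billed manually

def mean(xs):
    return sum(xs) / len(xs)

observed = mean(customer_b) - mean(customer_a)

# Permutation test: shuffle the pooled values many times and count how
# often a random split produces a gap at least as large as observed.
random.seed(1)
pooled = customer_a + customer_b
n = len(customer_a)
extreme = 0
trials = 2000
for _ in range(trials):
    random.shuffle(pooled)
    gap = mean(pooled[n:]) - mean(pooled[:n])
    if abs(gap) >= abs(observed):
        extreme += 1

p_value = extreme / trials
print(observed, p_value)  # tiny p-value -> strong "profit signal"
```

A p-value far below 0.05 corresponds to the better-than-99.999-percent confidence statement in the case.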

in his department to keep costs down. A $1,000 request to purchase data matrix software for the finance and accounting department was denied. "Spreadsheets work fine." The electronic billing and relational database topics were verboten.

Improve: The team spent a week gathering its courage and preparing evidence for a presentation to senior management. Figure 4 presents accurate AR-day predictions for differing combinations of all five factors. Note that both of the top cubes have shorter AR days. When the AR days come from Customer A and a new relationship, AR days are lower than with any other combination of factors.

Figure 4: The two-level, five-factor vector analysis compares all the factor interactions using the traditional 3D cube.

Figure 4 also explains part of the reason that executive resistance to evidence-based decisions continued. 3D vector analysis pictures do not look like bar graphs or pie charts. This particular finance department found 3D cube graphs to be upsetting. Following the presentation, the company purchased and installed a top-of-the-line workstation.

Control: Results produced by lowering days in AR by 30 days exceeded the projected $420,000 cash-flow gain in the first year. Total time required to complete the project was 90 days.

As the financial crisis passed, so did the use of evidence-based decisions. The heads-up improvement team put their heads down and went back to work. To this day, the company has refused to invest in either the education of its finance and accounting workforce or the purchase of data matrix software. "Spreadsheets work fine."

This experience taught us to present profit signals using a special kind of bar graph known as a Pareto chart, rather than the more powerful cube. The Pareto chart in Figure 5, which rank-orders profit signals from strong to weak, gives customers what they want in a way that poses no visual threat. People just want the answer.

Figure 5: Profit signals are easy to spot using a graph that ranks vectors from strong to weak.
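A Pareto chart of effects is nothing more than a ranking of absolute effect sizes from strong to weak. A text-only sketch with illustrative effect estimates, loosely patterned on the case (Customer and Relationship dominating):

```python
# Illustrative effect estimates for the five factors and one interaction.
# The values are invented; only the ordering mimics the case.
effects = {
    "Customer": 42.0,
    "Relationship": 18.5,
    "Customer*Relationship": 12.0,
    "Services": 3.1,
    "Sales visits": -2.2,
    "Manager": 0.8,
}

# Rank |effect| from strong to weak, printed as text bars instead of
# a graphics package.
ranked = sorted(effects.items(), key=lambda kv: -abs(kv[1]))
for name, eff in ranked:
    print(f"{name:22s} {'#' * int(abs(eff))}")
```

The tallest bar answers the "what matters most?" question at a glance, which is exactly what the executives wanted.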

Breaking the Time Barrier3

Long waits in hospital emergency departments are legendary. The Joint Commission on Accreditation of Health Care Organizations (JCAHO) has recognized the critical nature of Emergency Department (ED) overcrowding. JCAHO has instituted new Emergency Department Overcrowding Standards requiring a hospital's serious attention. The list of likely causes includes overcrowding, an aging population, a shortage of nurses, a shortage of inpatient and/or long-term care beds, and a saturated primary care system.

The Emergency Department is the front door of a hospital. It often accounts for a significant percentage of all admissions. Service excellence that meets or exceeds the public's expectations is essential.

The newly hired administrator of a community hospital, a certified Six Sigma Black Belt and a 30-year veteran with a Masters of Public Health administration degree, selected the ED as the hospital's initial Six Sigma project on her first day of work, after reviewing the hospital's admission data and top revenue-producing departments. The ED Charge Nurse had told her, "Our ED is closed to ambulances. We are on divert status. This happens all the time. You might as well get used to it. We cannot provide safe care if one more sick patient comes through those double doors." The nearby ED physician gave her an apologetic smile and said, "We are simply overwhelmed." The Black Belt RN glanced around. She saw vacant treatment rooms. Three staff members were cautiously watching her from behind the nurses' station. They were all pretending to be charting.

The following Monday, the RN, CEO administrator called a meeting to discuss the "ED Divert" issue. She asked, "What are the standards of evidence you use when you decide to close the ED?" The Charge Nurse responded, "Well . . ." The ED Medical Director, ED and Critical Care nurse managers, and the directors of the laboratory, imaging, and environmental

services, and the Emergency Medical Treatment (EMT) director from the local fire department responsible for the paramedics all attended. Everyone was resigned to the status quo. The general theme was that Emergency Department diversions were caused, in large part, by "other" departments in the hospital. No amount of effort could reduce ED divert time. It was an inevitable result of growing volumes. The list of suspected reasons for going on diversion status was as varied as the professional team that sat around the table. Every member had her or his own favorite reason, or two or three, that they firmly believed was the primary cause of ED divert:

"Every hospital in the city is having the same problem. Why, at Mid-Valley their ED is closed twice as much as ours."

"The CT tech takes call from home after midnight. We're always waiting for her to come in."

"There are never any beds in the ICU."

"If the Cath Lab crew was in-house 24/7 . . . well . . . why, that would solve the problem."

"If you want us to put our nursing licenses at risk . . ."

During the past six months, the Emergency Department had been closed or on diversion (divert) more than 5300 minutes per month, or 12% of available time. This totaled eleven 8-hour shifts, or three and two-thirds 24-hour days. Those closures penalized patients. They cost the hospital hundreds of thousands of dollars in potential revenue.

No one in the ED was familiar with Six Sigma techniques or tools. Nevertheless they began to wrestle with a complex process that involved most of the hospital's departments, from Admitting to X-ray. The Team began planning steps in the Six Sigma DMAIC breakthrough process. The Black Belt CEO led a brainstorming process to identify Critical To Quality (CTQ) factors. This is a crucial DMAIC first step no matter what the project is. They designed an experiment. They ran it and analyzed their data. As they reviewed the actual numbers from data that had been collected and arrayed in a data matrix, they were surprised.
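The divert figures quoted above can be checked with a few lines of arithmetic (assuming a 30-day month):

```python
# Checking the divert figures reported for the six-month review.
divert_min_per_month = 5300

hours = divert_min_per_month / 60              # ~88 hours on divert
shifts = hours / 8                             # ~11 eight-hour shifts
days = hours / 24                              # ~3.7 twenty-four-hour days
share = divert_min_per_month / (30 * 24 * 60)  # fraction of a 30-day month

print(round(hours, 1), round(shifts, 1), round(days, 2), f"{share:.0%}")
```

The numbers reproduce the case's "eleven 8-hour shifts," "three and two-thirds 24-hour days," and "12% of available time."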

DMAIC

DMAIC is the standard Six Sigma breakthrough project methodology. The application of DMAIC to this hospital's ED overcrowding and diversion problem produced dramatic, measurable, sustainable results, statistically and practically.

Breakthrough Results

The profit signal vector analysis showed that once a decision to admit a patient was made, the admission process simply slowed to a near stop. An "acceptable" wait time for a patient being admitted was open-ended. Any amount of wait time could be rationalized. Performance pressure was off. Once this practice was halted, everything changed. The department was astonished. In the first two months their initial project results list included:

• The average hours on ED divert dropped from 88 to 50 per month, a 48% reduction from the same period the prior year.
• The average ED Length of Stay (LOS) shrank from 3.6 hours to 1.9 hours.
• The number of Emergency Department visits increased by 12.6%.
• A 38.26% increase in Emergency Department gross margin was generated.
• Catheterization Lab time dropped from 93 minutes to 10 minutes.
• Patient satisfaction increased from 59% to 65%.
• Intensive Care Unit bed availability increased by 10.6%.

Define issues systematically. Issues identified included time on ambulance divert,

unacceptable ED patient length of stay (LOS), patients leaving without treatment (LWOT), low patient and staff satisfaction levels, and considerable lost revenue.

Measure using maps, models, diagrams, and process flow diagrams. The team identified potential Critical to Quality (CTQ) variables they believed influenced the department's Length of Stay (LOS). They prioritized reducing (minimizing) Emergency Department Length of Stay (LOS) as the key response. Everyone felt that all other issues would improve if LOS could be reduced. The team established performance measures and benchmark targets for each goal. Door-to-door Length of Stay (LOS), Left Without Being Treated (LWOT) as a proportion of all patients, and patient satisfaction levels were identified as the CTQ factors the team wanted to study. CTQ variables identified for evaluation were the patient's gender, a decision to admit or not, a "slow" or "fast" physician or nurse (identified by employee number), ready availability of an ICU bed, and laboratory and imaging testing.

Collect data, observe the process and begin data mining. The Black Belt Administrator, using the data matrix software JMP 5.0, created an 8-factor Designed Experiment on her laptop. Evaluating these 8 factors required only 16 runs to complete. The data were gathered in less than 24 hours (Figure 6).

Analyze data using a vector analysis applied to a data matrix. Prepare appropriate quality control charts and Design of Experiments (DOE) to determine CTQ factors.

Improve the process using evidence-based decisions to power Six Sigma breakthroughs. Control the process to insure that breakthrough improvements are sustained.
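A 16-run design for 8 two-level factors is a 2^(8-4) fractional factorial. The administrator built hers in JMP 5.0; the following is a hand-rolled sketch using one common textbook set of generators (JMP's custom designer may choose a different but equivalent layout):

```python
from itertools import product

# Base design: full 2**4 factorial in factors A-D (16 runs).
base = list(product([-1, 1], repeat=4))

# Resolution-IV generators for a 2^(8-4) fractional factorial:
# E = ABC, F = ABD, G = ACD, H = BCD (one standard textbook choice).
runs = []
for a, b, c, d in base:
    runs.append((a, b, c, d, a * b * c, a * b * d, a * c * d, b * c * d))

print(len(runs))  # 16 runs cover 8 two-level factors
```

Every column is balanced (eight runs at each level), which is what lets 16 runs estimate all 8 main effects.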

Figure 6: Custom design for an 8-factor, two-level Emergency Department Length of Stay (LOS) experiment.

The results of the Designed Experiment (DOE) were surprising to members of the Six Sigma project team (Figure 7). Before the project, ED staff and physicians ranked laboratory turnaround time as the most significant CTQ factor influencing length of stay in the Emergency Department. CT technician availability ranked a close second. This was one of many "ah-ha's." Don't trust your assumptions or your "gut," even if you are an expert. Six Sigma techniques, including a carefully designed experiment and rigorous data analysis (computerized software makes it easy), provided evidence at the 95% level of confidence. This confidence level helped managers make critical decisions quickly.

Figure 7: Results of an 8-Factor Designed Experiment.

The Project Team discovered that the CTQ factor with the most impact on ED LOS was admission status. Those patients being admitted had a significantly longer Length of Stay than those who were treated and released. Running a close second, to a lesser extent, was the availability of an Intensive Care Unit bed. Both were at the 99.99% confidence level of significance. While these CTQ factors, admission status and availability of an ICU bed, may appear obvious now, they were not at the outset of the project. Finding these two highly significant factors focused the team's efforts.

Yet the most frequent reason given for instituting the ED diversion status and closing it to customers was "No ICU Bed." When the administrator began discussing ICU bed availability with the nursing staffs in the ED and ICU, she quickly uncovered an insidious attitude. It was, "Us against them." Nurses (and, to a lesser extent, physicians) believed that "their" department worked harder and the "other" department was attempting to shift its workload to them. The lack of trust between the ED and the ICU required immediate attention.

Staff and physicians operated under the false assumption that "most" ambulance admits were very sick and "nearly all" would require an admission to ICU. There was a related assumption that the EMTs expected an ICU bed to be immediately available or they would take the patient to another hospital. A drill-down of the data revealed that less than 9 percent of the patients who entered the hospital's ED by ambulance were ultimately admitted to the ICU. The small percentage of admissions to ICU was a surprise to the EMT medical director. He voluntarily educated his staff so they would rely on the judgment of the ED staff.

Nurse managers evaluated and resolved issues between their departments. They arranged schedules and provided time for nurses to "walk in the shoes" of nurses in the other department. Nurses gained an appreciation of the unique and essential role each service provided to quality patient care. Attitudes quickly changed. ED and ICU service medical directors developed patient admission and transfer criteria that were approved by the

Medical Staff. The criteria, based on a patient's need for intensive nursing care, authorized nurses, in consultation with the ED physician, to transfer patients out of the ICU to open a bed for a new admission.

Communication between the two departments was difficult. Telephones might go unanswered in the ICU due to the immediacy of patient care needs. A flow process diagram revealed the problem. The ED nurse was required to provide a detailed report to the ICU nurse before she could initiate a patient transfer. Working together, the nursing staffs developed a 1-page report that the ED nurse would fax to the ICU in the event they were unable to complete a telephone report. A transfer of the patient to the ICU occurred automatically 30 minutes after the report was faxed, unless the ICU notified the ED to hold the patient. This is now an uncommon occurrence.

An unintended but exciting result of the Six Sigma ED project was a stunning reduction in "Door to Cath Lab" time. (This is a measure of the time from the patient's arrival in the ED to the initiation of treatment in the cardiac catheterization lab.) Before the ED project, the average door-to-cath-lab time was a respectable 93 minutes. While this time met national standards, it was longer than that of the hospital's nearby competitor. EMTs transported their most critical patients to the competitor hospital because of its superior door-to-cath-lab time.

Patients in the field with a potential diagnosis of myocardial infarction (MI) were evaluated by the EMTs. When they arrived in the ED, they were reevaluated by the ED physician, including completion of lab work and a repeat EKG, before the cath lab team was notified. The delay in calling in the cath lab team cost precious heart-muscle-saving time. Drill-down analysis of outcome data revealed that the EMTs diagnosed MI with nearly 100% accuracy. With the support of the administrator for the potential cost of additional cath lab salaries in case the EMTs' diagnosis was incorrect, ED staff were encouraged to rely on the EMTs' field diagnosis and initiate the call to the cath lab team as soon as the EMTs called in from the field. Door-to-cath-lab time plummeted to 10 minutes!

At the end of the first year, with an ED diversion time of near zero, the Emergency Department treated over 37,000 patients and realized a gross margin of $18 million. This was a 38.26% improvement over the previous year. The success of this project had a positive impact across the hospital. All departments and staff learned to value "their" ED as the "front door to their hospital."

Effecting and sustaining significant change is hard work, particularly when you are the one who is expected to change. The need for change creates strong emotions in people. People experience roller-coaster emotions of fear, loss, and denial before reaching acceptance. This is all normal. A critical function of the Black Belt is to manage people's feelings and emotions so improvements occur and are sustained.

"Beating Heart" Bypass Grafts

Though altruism and evidence influence medical treatments, historically speaking, economic pressure drives improvement. Again and again, medical "Six Sigma" style breakthroughs have astonished the world. Near-zero death rates related to surgical anesthesia and the polio vaccine's safety record are but two near-perfect success examples. Multi-million-dollar savings created by "beating heart" or "off-pump" coronary artery bypass outcomes are a case in point. The beating heart, off-pump surgical technique provides an ideal compass setting that points the way to breakthroughs.

The ability to consistently replicate experimental outcomes with a high degree of confidence is of paramount importance to everyone in the health care system. Sir Austin Bradford Hill's 1951 sentiments sound as fresh as a 21st-century General Electric Six Sigma news release: "In treating patients with unproved remedies we are, whether we like it or not, experimenting on human beings, and a good experiment well reported may be more ethical and entail less shirking of duty than a poor one." (Hill, Br. Med. 2:1088-90, 1952)

Since health care Six Sigma breakthroughs simultaneously improve the quality of patient outcomes and profitability, "off-pump" coronary artery bypass graft (CABG) projects are substantive.

Limited financial resources fostered the early 1980s development of "beating heart" CABG surgeries in Argentina.

Define: For over 40 years, the use of cardiopulmonary bypass (CPB) pumps defined coronary artery bypass grafting (CABG) procedures. Good outcomes and the relative ease of working on an arrested heart led most cardiac surgeons to favor the use of CPB.4 Statistically significant blood utilization and neurological side effects associated with on-pump surgeries were considered to be acceptable, even necessary, collateral damage related to the bypass operation. Though statistical evidence suggested off-pump operations were safe and advantageous for select patients, the prevailing beliefs of cardiac surgery sustained physician commitment to the on-pump surgical technique. It has taken a decade for surgical practice patterns to emerge that reflect sentiments expressed by researchers in 1992: "Further research should be directed to which subgroups can be operated on to advantage off-pump and which, if any, groups of patients should be confined to on-bypass operations."5 Patient demand for this lower-cost, higher-quality procedure has forced, and is forcing, surgeons to master a challenging, higher standard of care. Compelling statistical evidence is leading to the reluctant acceptance of this surgical technique in competitive USA health care markets.

Patterns and pattern recognition are key elements in the identification of breakthrough improvements. Database and computing systems accelerate both when they are included in an open system feedback loop. Again, Define, Measure, Analyze, Improve, and Control, the classic evidence-based decision cycle, provides a convenient way to summarize this story. Figure 8 illustrates the classic, standard Six Sigma closed feedback system. The closed feedback loop idea is a serious theoretical error that can be traced to the 1990s pseudoscience of "systems thinking."6 Closed feedback loops create entropy. Closed feedback systems are driven by opaque spreadsheet analyses and story telling where 83 percent of the information contained in raw data is suppressed.

Figure 8: The recommended Six Sigma closed loop feedback system is contrary to evidence-based decisions.

Evidence-based decisions must have open feedback systems. Open feedback systems depend upon the continuous entry and flow of objective evidence into judgments. Obviously doctors, nurses, allied health professionals, and administrative leaders are the Six Sigma "executive champion" and "Master Black Belt" experts who initiate breakthrough improvement actions. In addition to quantitative, open-loop feedback measures, qualitative impressions frequently expose opportunities. In the off-pump/on-pump dialogue, one qualitative signal is the long-running practice of opinionated debates between surgeons. Without a commitment to evidence-based decisions, these discussions are generally sustained without referencing or generating statistical evidence for analysis.

Measure and Analyze: Though surgical practice data are often collected by hand, increasingly these data are automatically entered into databases. Integrated statistical software packages now make it possible to analyze measurement data almost as quickly as they are recorded. Figure 9 shows columns and rows of data for a single cardiac surgeon who, after a number of his patients canceled their scheduled on-pump surgeries in order to have them performed off-pump by a different

surgeon at a competing hospital, decided to master the off-pump surgical technique.

Figure 9: A data matrix arrays historical data so a vector analysis can be used to identify profit signals. This array documents charges, lengths of stay (LOS), and the type of CABG surgery, either off-pump or on-pump.

The computerized analysis of the length-of-stay data in Figure 10 reflects findings similar to those of the 443 peer-reviewed articles published on the on-pump/off-pump subject since 1992. The peer-reviewed literature on this topic is consistent to a remarkable degree. Patients who undergo off-pump CABG surgeries experience dramatically lower lengths of stay.

Figure 10: The strong profit signal separating the lengths of stay for on-pump and off-pump surgeries is eye-catching with a statistically accurate "flying saucer" graph. On the hyperspace thrill ride of a vector analysis applied to a data matrix, the difference between data sets is significant at the 95% confidence level if the saucers can fly past each other without crashing.
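The "flying saucer" comparison amounts to asking whether the interval estimate for the difference in mean LOS excludes zero. A rough sketch with invented LOS values (not the surgeon's data), using a normal approximation:

```python
import math

# Illustrative lengths of stay in days (not the surgeon's actual data):
on_pump = [6.0, 7.5, 8.0, 6.5, 9.0, 7.0, 8.5, 6.0]
off_pump = [2.0, 1.5, 2.5, 2.0, 3.0, 1.5, 2.0, 2.5]

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

diff = mean(on_pump) - mean(off_pump)
se = math.sqrt(var(on_pump) / len(on_pump) + var(off_pump) / len(off_pump))

# Rough 95% interval using z = 1.96 (a real analysis would use a
# t quantile and Welch degrees of freedom, as statistical software does).
lo, hi = diff - 1.96 * se, diff + 1.96 * se
print(round(diff, 2), (round(lo, 2), round(hi, 2)))
```

When the whole interval sits above zero, the "saucers" fly past each other without crashing: the LOS difference is significant at the 95% level.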

A quality control chart, Figure 11, provides another view of the impact off-pump surgical technique brings to the quality of patient care. These improvements were dramatic. Since 1931, this pattern has symbolized the classic breakthrough pattern of an evidence-based decision. These breakthroughs now lead to near perfect performances known as Six Sigma. As the average length of stay shrinks, so does variation around the mean.

Figure 11 The Profit Signal in patient Lengths of Stay (LOS) were related to off-pump CABG surgeries.

The surgeon's database was stratified to facilitate a three-dimensional statistical analysis to consider the effect a number of other factors might have had on length of stay outcomes. Factors we considered were diagnostic (ICD) code variations, co-morbidities, age, gender, and race. Literature searches used to cross check statistical inferences are a value added service physicians appreciate.

The Cartesian coordinate system's cube is an ideal graphic for presenting multidimensional statistical evidence. An example is shown in Figure 12's cube plot. Though three factors are presented simultaneously, even a novice can interpret the results at a glance. The numbers contained in the rectangular boxes at the cube's corners are average values. In Figure 12, all four of the shortest lengths of stay related to CABG are located on the cube's left plane. All of the longer lengths of stay are located on the cube's right plane. The shortest average length of stay was a result of an off-pump surgery with a male patient with ICD code 36.11. The longest average length of stay was the effect of on-pump surgeries for men with ICD code 36.12. The only statistically significant factor related to a lower length of stay was a surgery performed off-pump. We can say with a 95 percent level of confidence that when off-pump surgeries are used on appropriate patients, they produce medically superior outcomes and lower lengths of stay.

Figure 12 Profit signals compare the surgeon against herself.

This case study did not include a Pareto chart analysis summary for two reasons. First, the data matrix software used to produce the evidence in this case did not have that feature. In addition, the organization had progressed beyond the need to present data in a simplistic way. Decision makers wanted to look at advanced, Six Sigma style, evidence charts.

Improve: Sixteen years of experience in promoting breakthrough improvements in health care quality and productivity teach an important lesson. Before changes occur in physician or hospital practice, benefits must be translated into a compelling financial story. Though this reality can be disheartening for caregivers who put patient safety first, leaders must prioritize cost accounting if they expect to see system wide improvements take place. More often than not, spreadsheet simulations are persuasive. Simulation modeling using spreadsheets is a relatively easy data matrix tool to master. The psychological impact of seeing 1,000 or more iterations of multivariate spreadsheet practice scenarios is significant.
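A cube plot like Figure 12 is nothing more than eight stratified averages, one for each combination of three two-level factors. A sketch with invented records (pump type, gender, ICD code, LOS in days; the surgeon's database is proprietary):

```python
from itertools import product

# Hypothetical stratified records: (pump, gender, icd, length_of_stay_days)
records = [
    ("off", "M", "36.11", 4.0), ("off", "M", "36.11", 4.4),
    ("off", "M", "36.12", 4.9), ("off", "F", "36.11", 4.3),
    ("off", "F", "36.12", 5.1), ("off", "F", "36.11", 4.1),
    ("on",  "M", "36.11", 7.0), ("on",  "M", "36.12", 8.2),
    ("on",  "F", "36.11", 7.4), ("on",  "F", "36.12", 8.0),
    ("on",  "M", "36.12", 8.6), ("on",  "F", "36.11", 7.2),
]

# Average LOS at each corner of the pump x gender x ICD cube
for pump, gender, icd in product(("off", "on"), ("M", "F"), ("36.11", "36.12")):
    los = [r[3] for r in records if r[:3] == (pump, gender, icd)]
    if los:
        print(f"{pump:>3} {gender} {icd}: mean LOS {sum(los)/len(los):.2f}")
```

With data shaped like these, every off-pump corner average lands below every on-pump corner average, the left-plane versus right-plane pattern the text describes.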

Figure 13 shows the profit signal's probable financial impact for one surgeon. Revenue gains for off-pump surgeries are predicted to range from a net gain of $448K to $1.45 million. The low end of the forecast's distribution suggests that by mastering the off-pump procedure for the majority of her patients, an additional $448K in revenue would be generated. On the high end of the distribution, this change could produce as much as $1.4 million. Savings were achieved through lower nursing care costs and overhead. Actual results fell near the center of the prediction parameters. Off-pump patients avoided adverse side effects while the hospital enjoyed improved profitability. These results are classic hallmarks of a Six Sigma style breakthrough.

Figure 13 Spreadsheet add-ins for modeling and simulation are a compelling, persuasive use of the data matrix and profit signal analysis.

Control: The final step in the Six Sigma DMAIC (Define, Measure, Analyze, Improve and Control) process is to standardize breakthroughs and hold the gains. Discipline is as important to success here as it is with each of the other steps. Leadership and culture determine the rate of adoption for breakthroughs in productivity and quality. The degree of success in every Six Sigma breakthrough is directly related to the level of commitment that is demonstrated by senior leadership. When the medical staff and other senior leaders are disciplined, and when they role model the use of science, statistical analysis, and systematic experimentation, breakthrough improvements occur. Six Sigma culture evolves along with the breakthroughs.
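The spreadsheet simulation behind a forecast like Figure 13 can be sketched in a few lines. The case-volume and per-case-gain distributions below are illustrative assumptions, not the book's proprietary model:

```python
import random

random.seed(1)  # reproducible run

# Hypothetical model: annual revenue gain = (off-pump cases) x (net gain per case).
def one_year():
    cases = random.randint(180, 260)           # off-pump cases per year (assumed)
    gain_per_case = random.gauss(4000, 1200)   # net gain per case in dollars (assumed)
    return cases * gain_per_case

# 10,000 iterations of the practice scenario, then read off the forecast spread
trials = sorted(one_year() for _ in range(10_000))
low, mid, high = trials[500], trials[5000], trials[9500]
print(f"5th pct ${low:,.0f}   median ${mid:,.0f}   95th pct ${high:,.0f}")
```

The low and high percentiles play the role of the $448K and $1.45 million ends of the forecast distribution; plugging in a real case mix and margin data would give the actual numbers.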

The Daily Grind

Don worked in the belt grinding department. Day after day, he and his co-workers removed “gate stubs” from metal castings to prepare them for final processing and shipping. The grinders were paid a handsome hourly rate. The other major expense for the area was the cost of belts. They went through a lot of belts on a typical shift.

Define: If you try to use a belt beyond a certain point, your efficiency in removing metal goes way down. The supplier representative had given the area manager a rule to use for deciding when the grinders should throw a belt away and put on a new one. The purpose of the rule was to minimize the total expense of the operation. The rule was called “50% used up”. There were examples of belts that had been “50% used up” hanging on the walls in the grinding area.

Don thought the rule was wrong. He thought it caused them to discard the belts too soon. He had a hypothesis that using the belts a little longer would reduce the belt expense with no loss of grinding efficiency. He also suspected that the supplier wanted to sell more belts. We had no way to evaluate this, so we let it go. Don had come up with a new rule called “75% used up”. He proposed doing a designed experiment to determine whether or not the new rule was more cost effective than the old rule.

We met with Don, the area manager and the supplier rep to discuss the project. To our surprise, the supplier rep was vehemently opposed to the project. He said the “50%” rule was based on extensive experimentation and testing at his company's R&D laboratory. He said we were wasting time trying to “reinvent the wheel”.

Don argued that laboratory tests may not be good predictors of shop-floor performance. We thought he had a point. We were also starting to see why he was suspicious of the supplier. The area manager also thought Don had a good point. He gave the go-ahead for the project. He allowed Don one full day to complete the experiment.

Measure: Don figured he could get 16 castings done in one day. When the other grinders heard about the experiment, they suggested other things that could be tested at the same time. The contact wheels currently used on the grinding tools had a low land-to-groove ratio (LGR). One of the grinders wanted to try a wheel with a higher LGR. Another wanted to try a contact wheel made out of hard rubber instead of metal. A third reminded Don that belts of at least two different grit sizes were routinely used. He felt that both grits should be represented in the experiment to get realistic results.

Table 2 contains the data matrix for the grinding experiment as it was eventually run. There were four factors at two levels each. The response variable was the total cost for each casting divided by the amount of metal removed. The total cost was calculated as labor cost plus belt cost.

Table 2 The data matrix for Don's grinding experiment.

Analyze: An eyeball analysis applied to Table 2 suggested that Don was on to something with his “75% used up” rule. It also suggested that high land-to-groove ratio (LGR) is better than low, and rubber wheels are worse than metal ones.
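Don's 16-run layout is a two-level, four-factor full factorial: every combination of the four factors appears exactly once. Generating the coded design matrix is mechanical. In this sketch the −1/+1 coding and the GRIT label are our notation, not the book's:

```python
from itertools import product

factors = ["USAGE", "MATL", "LGR", "GRIT"]

# All 16 runs of a two-level, four-factor full factorial, coded -1/+1
runs = [dict(zip(factors, levels)) for levels in product((-1, 1), repeat=4)]
for i, run in enumerate(runs, start=1):
    print(i, run)
```

Each factor's column balances exactly (eight runs at each level), which is what lets every effect be estimated independently from the same 16 castings.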

Figure 14 shows the Pareto Plot ranking the factors and their interactions by the strength of their profit signals. The strongest signal was the comparison of steel to rubber contact wheels (MATL). This signal told us that rubber was not a good idea. The next-largest signal was the comparison of the 50% rule to the 75% rule (USAGE). It predicted significant savings in line with Don's idea. The third-largest signal was the comparison of a low to high land-to-groove ratio for the contact wheel (LGR). But let us not be hasty. The next two signals involved interactive effects. The message here was that the actual cost reductions from implementing the USAGE and LGR results would be different for the two grit sizes.

Figure 14 Pareto Plot ranking the factors and interactions in the belt grinding experiment by the strength of their profit signals.

Improve: Don's experiment produced two recommendations:

1. Use his 75% rule instead of the supplier's 50% rule.
2. Use contact wheels with the higher land-to-groove ratio.

The combined impact of these two changes was a predicted cost reduction of $2.75 per unit of metal removed. This multiplied out to about $900,000 in annual savings. Don's recommendations were quickly implemented throughout the grinding department. The actual savings came in a little under the prediction, but everyone was happy. Not bad for a one-day project.
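A Pareto Plot like Figure 14 ranks effect estimates from the factorial. In a two-level design, each main effect or two-factor interaction is the average response at the high level of a contrast minus the average at its low level. A sketch with invented cost responses, deliberately arranged so the ranking mirrors the text (MATL strongest, then USAGE, then LGR, then grit interactions); Don's actual Table 2 data are not reproduced here:

```python
from itertools import combinations, product

names = ["USAGE", "MATL", "LGR", "GRIT"]
levels = list(product((-1, 1), repeat=4))   # 16 coded runs, standard order
# Hypothetical cost per unit of metal removed for each run
y = [8.55, 8.25, 7.75, 7.85, 11.35, 11.05, 10.55, 10.65,
     7.25, 7.55, 6.45, 7.15, 10.05, 10.35, 9.25, 9.95]

effects = {}
for k in (1, 2):                            # main effects and 2-factor interactions
    for idx in combinations(range(4), k):
        total = 0.0
        for run, resp in zip(levels, y):
            sign = 1
            for i in idx:                   # contrast column = product of coded columns
                sign *= run[i]
            total += sign * resp
        effects["*".join(names[i] for i in idx)] = total / 8  # (mean at +) - (mean at -)

for label in sorted(effects, key=lambda name: -abs(effects[name])):
    print(f"{label:12s} {effects[label]:+.2f}")
```

Sorting by absolute size is exactly the Pareto ranking; the sign then says which level of each factor is cheaper.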

Control: Some degree of cost reduction was achieved by all the grinders, but it did not apply uniformly. There was still a lot of variability in grinder performance. Attacking this variation was the obvious next step. We don't know if our recommendation was ever implemented.

“Die Tuning” for Vinyl Extrusion

A vinyl extrusion operation receives a “die package” (blueprint) from a customer for a new “profile” (part). The extruder then designs and machines the “die” (tooling) for extruding the profile. The extruder bears the development cost in exchange for a life-of-contract “sole supplier” status.

Once the initial machining of a die is completed, a tester runs that die on one of several extrusion lines reserved for testing new dies. Once the line stabilizes, the tester does visual inspections and measures the control dimensions with a caliper. The tester is also supposed to determine the best run conditions for the new die. The inspection results and the dimensions are taken to a revision programmer who determines whether a revision is needed. If it is, the revision programmer sends the die back to the machine shop with a revision sheet describing the needed changes. Each “revision” involves re-machining the die. The process of machining, testing, and revising dies is called die tuning.

The number of revisions required to get a new die ready for production varies unpredictably from 0 to as high as 30. The average cost per revision is about $2000. As a result, the total cost varies unpredictably from $2000 (no revisions needed) to something like $50,000 (lots of revisions needed). An extruder can easily spend $1.8 million or more each year on die tuning. Reducing the dramatic variation in the number of revisions was identified as a project with potentially huge financial benefits.

Define: We started with a “Kaizen-blitz”, a very fast and focused review of the die tuning process. The review concentrated on the process variables a tester can adjust while searching for the best run conditions. Potentially these factors could include:

• Line speed
• Die-to-calibrator distance
• Calibrator vacuum
• Screw Revolutions Per Minute (RPM)
• Screw oil temperature
• Barrel zone temperatures
• Die zone temperatures
• Melt temperature
• Melt pressure
• Weight

Testers are under time constraints. They adjust some of these variables by trial and error to get the dimensions closer to nominal and improve the cosmetic quality. The variables most commonly adjusted are line speed, die-to-calibrator distance and weight. Examples are lowering the line speed or increasing the weight. The other variables tend to remain at “baseline run conditions” assigned before the die is machined.

Our findings were as follows:

1. Die revisions were based on single measurements taken by a hand-held caliper on plastic parts. In all industries the repeatability of such measurements is notoriously bad.
2. Letting testers choose which variables to adjust may have long-term economic consequences.
3. The trial-and-error method has virtually no chance of finding good run conditions.

Item 1 looked like a possible “smoking gun” for the problem of too many revisions.

We proposed that a small series of designed experiments be made a routine part of die tuning. This would require more time for each revision cycle. But this process held the promise of dramatically reducing the number of revisions. The basic idea was this: before we cut metal again, let's see if we can “process our way out” of some of the dimensional or cosmetic problems.

We felt the Design of Experiments (DOE) approach could address all three findings. Some of the team members wondered how it could help with Item 1. The answer was that the results of a DOE are always based on weighted averages rather than individual measurements. This automatically improves the reliability of the data used to determine revisions.

Measure: For the initial experiment, the team decided to observe four continuous factors: line speed, die-to-calibrator distance, calibrator vacuum and weight. There were four continuous factors at three levels each. We used statistical software to generate a data matrix similar to the one shown in the first six columns of Table 6. The matrix in Table 6 is the as-run version, with the weights and calibrator vacuums actually obtained in place of the nominal values in the original matrix. The levels of the four factors are coded to protect proprietary information.

The die in this case had a dual orifice. This means that two profiles are extruded at the same time. Results for the two profiles are distinguished in the matrix as Sides 1 and 2. The response variables included 13 control dimensions and a 1-5 distortion rating, where higher is better. The control dimension data are expressed as deviations from nominal in thousandths of an inch. Remember, because this table is a data matrix, each column is a single entity or vector. A correct analysis breaks up the variation vector in the cornerstone of evidence into Noise and Profit Signals.

Table 6 The data matrix that fills the next page is from the die tuning experiment.

Analyze: A matrix of distribution curves was the result of jointly optimizing all 14 response variables. The statistical software performed this optimization in just a few seconds. Please accept our apologies for the fact that the complexity of this statistical graph exceeds the boundaries of this introductory book. The quick story follows.

Improve: The implications were staggering. By doubling the line speed and reducing material costs by 50 percent, the production line produced perfect quality product after just one revision and some very minor additional die tuning. Additional key findings were as follows:

• We were able to run a four-factor die tuning experiment in one day.


• We generated a wealth of information on how each factor affects each response variable. Some results confirmed prior beliefs, others contradicted prior beliefs.

• We showed that using weight and line speed as adjustment factors in die testing leads to unnecessarily high weights and low line speeds. This locks in unnecessary costs for the life of a contract. It may also contribute to problems with quality, which in turn lead to larger numbers of revisions.

Control: The process of changing the way die tuning is done is underway. Much has been accomplished. More is expected. Similar experiments have been run on other new dies with similar results. In one case a die was saved in the nick of time from going back for an incorrect revision that would have spawned further revisions to repair the damage. A conservative estimate of the annual cost reduction from extending this method to all new dies was $1.2 million, half of the current annual budget for die tuning.

Endnotes

1 http://gi.grolier.com/presidents/aae/side/knownot.html

2 Cartmill, Matt. “Oppressed by Evolution”, Discover Magazine, March 1998, pages 78-83, as reported by Richard Dawkins on page 20 in his book Unweaving the Rainbow.

3 Pfister, Albert J., Zaki, M. Salah, et al. “Coronary Artery Bypass without Cardiopulmonary Bypass.” Ann of Thorac Surg 1992; 54:1085-92.

4 Cheryl Payseno, an RN, former hospital administrator and certified Six Sigma black belt, wrote this case study for us. Cheryl led the charge for the use of Designed Experiments in health care in 1995 with Daniel Sloan. Results from those early innovations were published by the American Society for Quality's Quality Press.

5 Pfister, Albert J., Zaki, M. Salah, et al. “Coronary Artery Bypass without Cardiopulmonary Bypass.” Ann of Thorac Surg 1992; 54:1085-92.

6 Senge, Peter M. The Fifth Discipline: The Art and Practice of The Learning Organization. New York: Doubleday Currency, 1990.

Chapter 5
Using Profit Signals

Profit signals show you the money. Profit signal vectors literally and figuratively show you what works best in any business, financial, health care, manufacturing or service process. Once the tools have done their job, the graphic presentation of evidence paves the way to breakthroughs in quality, productivity and profitability. This chapter explains how vector analysis applied to a data matrix showcases the information contained in raw data.

Profit signals are like televisions, radios, cars, telephones and the Internet. They attract attention. People want to play with them. They want to use them. This natural occurrence unsettles old-school managers. Some react like the mythical John Henry: “Before that steam drill shall beat me down, I'll die with a hammer in my hand.” Vector analysis applied to a data matrix is the steam engine that humbles them.

The spreadsheet is the first and only computing program many business people learn to use. Hammering out spreadsheet revisions keeps employees occupied. Though they are busy, they may not necessarily be productive. These people are occupied reworking Proformas, business plans, and trying to explain why actual monthly financial results do not fall exactly on the predicted straight line of a one-dimensional “variance” analysis. When all you have is a sledgehammer, everything looks like a spike.

Fortunately, the cornerstone of evidence is appealing. By constructing a cornerstone-of-evidence tetrahedron using bamboo skewers as vectors and spheres of Sculpey Clay as points-in-hyperspace connectors, people can weigh evidence in their own hands. Physical models win hearts. The look and feel of an Analysis of Variance tetrahedron in one hand and a single stick in the other convert would-be 21st Century Luddites into evidence-based decision champions.

The following are a few of the many reasons why so many former skeptics embrace the use of profit signals to make more money.

1. With profit signals, you don't have to solve equations. Rigorous inductive and deductive reasoning, which dates back to Aristotle, is built into statistical software designed specifically for the data matrix structure.
2. With profit signals, you have only one formula to remember.
3. With profit signals, you can produce 10 times the work in a fraction of the time now spent doing arithmetic with a spreadsheet.
4. Profit signals help you make more money with less work. Money is THE big reason Six Sigma projects are so popular around the world.
5. Profit signal pictures are aesthetically pleasing.

A Better Way to Look At Numbers

Think back to your Five-Minute PhD. In a data matrix, each number is an integral part of an entity called a vector. Each column of numbers is its own vector. Each column is a field or variable with a precise operational definition. Each number is framed in the geometric context of a profit signal vector. In other words, a data matrix channels the intelligence and logic of the best minds our human species has produced.

Measurements presented in the rows and columns of a spreadsheet convey no sense of unity. Each number is an orphan locked in its own cell. A ghost named Zero inhabits empty cells. Commonsense relationships between numbers are ignored. There is no sense of purpose, no arrows pointing to the money. There are no vectors. Logic takes a back seat to manipulation. Arithmetic is the two-stroke engine running Abacus Prison.

Vectors show you the money. Vectors have physical properties. These properties can be measured and displayed in three dimensions. We strongly urge you to actually build a Sculpey-Clay/bamboo skewer model whenever the dimensions of a vector analysis are revealed to you in one of our examples. It costs about one buck for the whole kit.

Corrugated Copters

C. B. Rogers created the helicopter analogy while working at Digital Equipment in Marlboro, Massachusetts. Professor George E. P. Box introduced us to it at the University of Wisconsin, Madison in May 1995. Dr. Box, the Fisher Professor of Statistics and a Fellow of the Royal Society and the American Academy of Arts and Sciences, was also a riveting teacher who taught us that an analysis of variance was so simple “you could tell the answer just by looking at the numbers on a cube.” He and his colleagues used the helicopter in Figure 1 to illustrate.

Please take a moment to build one now so you can follow along with our data explanation. First, tear a piece of 8.5 inch by 11 inch paper in half. If you have a pair of scissors and quality paper, use them. These tools will make the construction process more satisfying. The results will be more rewarding. If you are in a hurry, tearing paper works fine. Next, cut or tear the top section to form the “blades.” Finally, follow the folds at the bottom to form the helicopter's fuselage, or body. You may tape the body to give it some rigidity if you like.

Figure 1 This inexpensive product is an analogy that works well for teaching data matrix and vector analysis principles to people in all industries.

Hold the finished product with the blades perpendicular and away from the body at shoulder height. Let it drop. Like seeds from a maple tree, the blades will catch air while the aircraft spins to the ground. This is fun to do and fun to watch. Now, time the flight using the black, blue, purple, or pink plastic digital chronometer you wear on your wrist.

Corrugated Copters learned a big lesson when their company was founded in 1996.1 Their original corporate slogan was “Drive down costs!” Their current, more enlightened view is wordier: “The best way is the most profitable way.” This saying has become a ritual chant that opens all management meetings.

For this game, each helicopter costs $9 million to build. Longer flight times are worth quite a bit more money than shorter flight times. For each second of additional flight time, customers are willing to pay an additional $1 million in price. Eliminate all costs associated with take offs and you can really make money.

Take a moment now to draw a Six Sigma Supply, Input, Process, Output, and Customer (SIPOC) flow diagram. You will learn Corrugated Copters is a behemoth that demands global logistical support. Complexity surrounds Corrugated Copters. You and your products are parts of a system. The market is filled with uncertainty and risk.

The paper began as a seed that was planted on a tree farm in the Pacific Northwestern United States in 1948. The quality and cost of that tree affects the quality and cost of your building materials. One company cuts down the tree. Another ships it as Input to the pulp mill. The pulp mill Process creates the paper. The packaged Output is sold to its wholesale Customer. Corrugated Copters is the retail customer who buys it from the wholesale customer.

The company's measuring device is a five-mode wristwatch with alarms. It breaks hours into hundredths of a second. Accuracy matters. Since time is money, and money is time, the calibration of this instrument is exceptionally important. The store that supplies this watch keeps a supply of them on hand just in case you need a new one in a hurry. One of your employees has created a Lean flow diagram to show the entire value stream for your watch. It used to be silicon, some oil, and ore. Just-In-Time has eliminated almost all of Corrugated Copter's inventory costs. The most efficient routes for delivering these devices to your engineers are annotated with dollars, times, and inventory turns.

It almost goes without saying that collecting data is a big job. Not everyone is cut out to be a helicopter pilot. Not everyone could hope to be a timer. The analysis of that data is yet another specialized task that has its own job classification. Last and certainly not least, the brains behind Corrugated Copter's success have been, to varying degrees, educated. The pen or pencil you used to record your measurements also has an informative SIPOC diagram archived for reference in the event another new Six Sigma breakthrough is needed.
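The game's economics reduce to a one-line model: revenue is $1 million per second of flight against a $9 million build cost. A minimal sketch (the function name is ours, not the book's):

```python
def profit_millions(flight_seconds, cost=9.0, price_per_second=1.0):
    """Profit in $ millions for one helicopter flight."""
    return flight_seconds * price_per_second - cost

print(profit_millions(10.0))  # a 10-second flight clears $1 million
print(profit_millions(9.0))   # at exactly 9 seconds, revenue only covers cost
```

This is the model behind the dialogue that follows: a 10-second flight grosses $10 million and nets $1 million, while a 9-second average means zero long-run profit.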

Testing the Current Way of Doing Things

Avona Sextant, a Corrugated Copter senior executive, has a Five-Minute PhD. More than anyone else in the company, Avona is committed to evidence-based decisions. Avona will listen only to stories that have evidence in their punch lines. Avona is often called upon to facilitate meetings. When she is not in the room, teams just naturally converge on answers that lead to a consensus and a “path forward.” When Avona joins their dialogue, Copter teams seem to argue amongst themselves.

Some employees have heard quite enough of her New Management Equation speech. Bets are routinely placed, money is won and lost, over when and how many times she will say the word “evidence” in a meeting. Some think Avona is goofy. Some suspect her peculiar predisposition is a genetic disorder. Others think she is crazy like a fox. They suspect that Avona's little formula for calculating Chance variation only works with simple numbers like 3, 4, and 5. They also know this Six Sigma stuff is a passing fad. They are going to wait it out and hope for the best.

In any case, though there is resistance to her methods, no one argues with her fundamental point of view: “The best way is the most profitable way.” On this they are in full agreement. The problem is how to determine which way is best. On this there is a considerable amount of debate.

During a recent productivity breakthrough, the mid-management team of Tom, Dick and Mary produced a double-digit flight time! Just yesterday they booked a record-breaking 10 seconds. They are proud of themselves and bragging when Avona walks in.

“Ten seconds. What an awesome and terrific flight time!” Avona cheers. “That's worth ten million dollars in gross revenue. That would be a profit of one million dollars.” She adds, “I can't wait to see the rest of your evidence.”

“Evidence?” asks the team.

Using Profit Signals 157 “This is so exciting. she had programmed a worksheet with vector-analysis formulas built into the cells. Her Excel spreadsheet immediately produced the vector analysis displayed in Table 1. “Oh. You must have flown this machine more than once. Boyles. I just want to see your other measurements. “Our objective of 9 seconds is a fixed number rather than a measurement.” she explained. Plus I also had an abacus for my backup system. If we average more than 9 seconds when we launch the product line I will be euphoric.2 Because her abacus was a Chinese rather than a Japanese machine. The abacus was the world’s first computing system.” “I don’t think that department ever thought of a column of numbers as a vector.” said Dick. She input the three data points. The new Six Sigma Black Belts in Accounting are changing history!” said Avona. When Avona first saw vector analysis applied to a data matrix. “I have no idea how © M.” The team showed her all their data: 9 seconds. “I used to use one of those.” said Avona. she knew the time had finally come to retire her abacus and her spreadsheet too. With her templates. “Is that what accounting calls a variance?” “Good call Mary. Yes it is. Mary observed.” She drew the picture in Figure 2 to illustrate the vector analysis of the difference data in Table 1. I see you’re using a spreadsheet. Daniel Sloan and Russell A. In the meantime. “This gives us the Differences vector. 8. she learned long ago to translate binary numbers into regular old numbers and back again with the flick of her right index finger.” Avona’s aunt in Hong Kong taught her how to use an abacus when she was a little girl. Avona was still waiting for her statistical software purchase order to be approved. 2003 . people didn’t have to type in any formulas. Otherwise we won’t make any money in the long run. All Rights Reserved. So the first step is to subtract it from the raw data. “They do now.9 seconds and 10 seconds.

01.158 Using Profit Signals ������������� �������� ����������� ������������ ������ ������ � � ���� � � ���� ���� � � � ��� ��� ��� � ���� ���� � � � �������� ������ � ��� �� ������������������ � ������ ������ � � � ����� ������ ���� ���� ��� � ���� ���� ���� ����� ���� ��������������� �������������������������� �������� ����������������� ��������������������� � ������ ������������������������� ����������������� � ������ ��������������������������������������������������������������� ������������������ �������������������������������� Table 1 Vector analysis for testing the current helicopter design against the performance objective. Some have already doubled their personal productivity.” “Oh come on Avona. Noise is calculated by subtracting the Profit Signal value from the respective value in the difference vector. “That’s how the New Management Equation works.” © M.74?” she shined (Table 1). The profit signal coincides with the average difference from the 9 second objective. It has one degree of freedom because it is determined by a single number—its average—0. Daniel Sloan and Russell A. See Figure 2. even for a Merchant of Venice. It’s wonderful. is equal to the sum of the squared lengths 0. All Rights Reserved. With a properly designed data matrix.” “See how the squared length of the difference vector. 1. they are going to solve the 1. Our Black Belt CPA Peruzzi told me the tip off for her was the word “double”. “Having those numbers add up is no big deal.000 year old waste and rework problems related to the 14th Century’s double entry bookkeeping system. the second entry is needless rework.3. 9 seconds. Everything always adds up. This arithmetic is a Law of the Universe. Peruzzi is convinced the entire double entry ‘bookkeeping system’ is nothing more than a massive hidden factory loop. The raw data are flight times in seconds.” chided Mary. Get it? ‘Double entry? Rework entry?’ Well.27 and 0. Boyles. 
“Our Black Belt CPAs are arraying entries into a data matrix. It is a marvel what Six Sigma education and training can do. 2003 .

Figure 2 This is the picture of the key vectors in Table 1.

“Did you notice that when you multiply a minus times a minus, the sign becomes a plus? And look how confusing that –0.4 in the Noise vector column is. I always have a hard time remembering that a negative number like –0.1, minus a positive number like 0.3, turns out to be a bigger negative number!”

“The last time I saw this stuff was when I had to learn to use a slide rule in Mrs. Beamer’s algebra period,” complained Tom.

“Oh it is. It is!” said Avona. “Laws of the Universe strike again.” Though the team was tired of Avona’s boundless enthusiasm, to make matters worse, she started talking about evidence.

With a push of the square root button on their calculators they could see that the sample standard deviation—the square root of the 0.37 Variance—was about 0.6 seconds. Their average flight time was about 9.3. Even if these numbers perfectly described what would happen in long-run production, future times would vary. The best performance the team could expect would be that future flight times were unlikely to fall below a lower “three-sigma limit” of about 7.5 seconds [7.5 = 9.3 – (3 × 0.6)].

“The null hypothesis here is that our future average flight time will be 9 seconds. We want to disprove this

hypothesis. The data will show this is true if and only if the p-value is small enough.”

The room was quiet. By international standards, a p-value less than 0.15 gives a ‘preponderance of evidence’ against the null hypothesis. A p-value less than 0.05 gives ‘clear and convincing’ evidence against the null hypothesis (Table 2).

Table 2 Standards of evidence

“Our p-value is 0.428, not even close to the lowest standard. This means there is no evidence at all that the average flight time is significantly different from 9 seconds. There is no signal here, just a lot of noise in our system. These differences are probably due to Chance.

“Also, if the mean is exactly 9 seconds, our average gross revenue will be exactly equal to our cost, $9 million. This means our long-run profit will be zero. We will be making money on half, and losing money on the other half. This is not good. We want to make money. We want the average flight time to be higher. The best way is the most profitable way. It is a Law of the Universe.”

To illustrate the implications of her conclusion, Avona drew the picture in Figure 3. Depending on variations in the weather, wind, pilot, timing device, paper, and construction, the time could vary up to 10.8 and all the way down to 7.2 seconds.
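The p-value comes from comparing the profit signal's squared length per degree of freedom against the noise's. A rough sketch of that calculation, starting from the rounded sums of squares (the closed-form F(1, 2) tail is used here; because of rounding this does not reproduce 0.428 exactly, but it lands well above the 0.15 threshold either way):

```python
import math

# One-sample vector analysis: F = (signal SS / 1) / (noise SS / 2).
signal_ss, noise_ss = 0.27, 0.74
f_ratio = (signal_ss / 1) / (noise_ss / 2)

# For 1 and 2 degrees of freedom, P(F > f) = 2 * P(T > sqrt(f)), where T
# has a t distribution with 2 degrees of freedom, whose tail probability
# has a simple closed form.
t = math.sqrt(f_ratio)
p_value = 2 * (0.5 - t / (2 * math.sqrt(2.0 + t * t)))

print(round(f_ratio, 2))     # 0.73
print(round(p_value, 2))
print(p_value > 0.15)        # True: no evidence against the 9-second null
```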

Figure 3 The Normal distribution of flight times if the mean is 9 seconds and the standard deviation is 0.6 seconds.

Everyone had taken a liking to Avona’s signal/noise analogy months ago. They all agreed with her interpretation of their data. Ten was an exciting, encouraging number. But, as usual, the team ended their argument with an agreement: they needed to know more before they could launch the new product. There were two possibilities: (1) The problem might just be the small sample size of 3. They could do more tests of the current design to strengthen the signal and reduce the noise. This would let them determine the average flight time with greater accuracy. (2) They might need to go back to the drawing board and find a way to further increase the flight time.

Ancient Greek mariners used the sextant to navigate the Mediterranean Sea’s lucrative markets. In his little book, Posterior Analytics, Aristotle equated the right triangle with truth.3 For 2500 years the right triangle has shown us the route to profitability. Applying vector analysis to a data matrix on a regular basis is a good way for today’s seekers of truth to learn about Aristotle’s principles. Six Sigma experts know that the New Management Equation discovered by an old Greek named Pythagoras 2500 years ago is worth billions of dollars today.

In analysis, as in telecommunications, customers want a strong signal. Communications engineers from Marconi in 1901 to Nokia in 2003 have appreciated the value of a high signal-to-noise ratio. The data matrices and vector analyses employed by engineers differ only superficially from the matrix and vectors you used to earn your Five-Minute PhD.

Overcoming Obstacles

“Science phobia is contagious,” wrote Carl Sagan, an astronomer and television celebrity, just prior to his death in 1996.4 “Some people consider science arrogant—especially when it purports to contradict beliefs of long standing or when it introduces bizarre concepts that seem contradictory to common sense. Like an earthquake that rattles our faith in the very ground we’re standing on, challenging our accustomed beliefs, shaking the doctrines we have grown to rely upon can be profoundly disturbing.”5

The transparent analysis principles in the cornerstone of evidence shake the foundations of business decisions. Until they get the hang of using these tools, executives may find the data matrix and vector analysis distressing. Both concepts tend to terrify cost-accounting analysts. In our consulting practices over the past 20 years, we have found that some of the people who fear math most are Accountants, Controllers, Financial Analysts, Chief Financial Officers, Chief Operating Officers, and Chief Executive Officers. Once on board, just like Avona, these executives and analysts become vital assets for breakthrough project teams.

Many corporate officers “did not do well in high school algebra.” Math phobia is another, even more daunting obstacle on the high road to evidence-based decisions. Take Daniel Sloan for instance: “Numerical dyslexia, reversing numbers instead of letters, has plagued me since I memorized my times tables in Mrs. Peiffer’s fourth grade classroom. I must wear glasses to see. I must use a computer to do math. I can no more do math in my head than I can read the letters at the bottom of an eye chart without my glasses. Confronting math phobia was the most painful, anxiety-provoking, downright embarrassing, and humiliating career step I ever took.

“Overcoming my math phobia was a more strenuous challenge than all of my five years as a Vice President of Marketing, my stint as a Senior Vice President in a publicly traded, $500 million corporation, publishing five peer-reviewed statistical textbooks, and founding and running my own business for 14 years. It has been as rewarding as it has been difficult. One of the best things success has given me is the opportunity to help other business leaders like me take that frightening first step forward.”

Larger than science and math phobias combined is the fear of losing one’s job. Six Sigma is a cultural business force that compels people to step up to a difficult task. It can and does persuade executives and line workers alike to face and overcome both these phobias. Experience shows, money motivates.

The best news for executives and workers alike is that cheap, reliable, and very user-friendly software makes vector analysis as easy to learn as sending an E-mail. Computerized, personal learning programs deliver privacy. Privacy is exceptionally important to adult learning. Pro-One’s CD-ROM multi-media course Mathematics, Math Blaster, Alge-Blaster, and many other programs are great ways to re-learn the principles of addition, subtraction, multiplication, division and the order of operations. They are fun. They are available for adults who suffer from science and math phobias. Private tutors and educational consultants are other options that work well.

Comparing Two Ways of Doing Things

“Hey Avona!” shouted Tom. “We think we have some evidence you are going to like. Just look at this stack of numbers. It looks like there might be a genuine difference between the two different helicopter designs.”

Avona’s eyes opened wide. “Way to go!” Still not having her statistical program, she entered the data into one of her spreadsheet templates and showed them the vector analysis in Table 3.

“I can’t believe it took me an hour to program this worksheet template so it will act like a data matrix,” Avona complained. “What a waste of time. I sure hope the purchase order for my statistical software gets approved soon.”

Table 3 Vector analysis for comparing two helicopter designs. The raw data are flight times minus the objective of 9 seconds. The profit signal consists of the average variation for each design. The Profit Signal Vector has one degree of freedom because a single number, 0.2, determines it.

“So Table 3 is where your ‘cornerstone of evidence’ comes from?” asked Mary.

“Right. The average variation for pink helicopters is 0.2 seconds of flight time. The average variation for white helicopters is –0.2 seconds of flight time. When the numbers in this column are squared, the minus sign disappears. The squared lengths of all the vectors are connected by their part in the New Management Equation (NME).”

Avona loved evidence, but patience was not her long suit. “We can make a model of your data and our new Analysis of Variance using some bamboo skewers and Sculpey Clay. I just happen to have a supply in my desk drawer. Just look at my models (Figure 4).”

Figure 4 The cornerstone of evidence represents any vector analysis. A Polydron regular tetrahedron model is next to a cornerstone of evidence. The labeled edges correspond to the vectors in Table 3. Differences in raw data change the dimensions. It is a Law of the Universe.

Avona played with all sorts of modeling toys. Her office was filled with them. She told people they were symbolic. She would go on and on to anyone who would listen about some artist named Alexander Calder.

“Let’s use my $1 handheld calculator to help us cut the bamboo skewers to length. We will use inches as the units. The length of the raw data vector is the square root of 8.38, which equals 2.89 inches. The length of the data average vector is the square root of 8.00, which equals 2.83 inches. The length of the variation vector is the square root of 0.38, which equals 0.62 inches. The length of the profit signal vector is the square root of 0.32, which equals 0.57 inches. The length of the noise vector is the square root of 0.06, which equals 0.24 inches. The profit signal and noise vectors are the fine print in a vector analysis. The noise is so short it will be buried completely in the Sculpey Clay.

“We haven’t talked about that last vector in the back of the tetrahedron. This is the vector of hypothetical predicted values. I didn’t include it in my spreadsheet templates because it isn’t important in the type of experiments we’ve been doing. It’s tremendously important in response surface experiments. That’s where we are optimizing over several continuous variables.”

“I sure wish we had our data matrix software. It is silly for us to use a hand calculator.”
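The skewer lengths are simply square roots of the sums of squares, and by the New Management Equation the squared lengths must add up. A quick check, with Python standing in for the $1 calculator:

```python
import math

# The squared length of the raw data vector splits into data average +
# profit signal + noise; the prediction vector is data average + profit
# signal. Lengths (in inches for the model) are the square roots.
raw_ss, avg_ss, signal_ss, noise_ss = 8.38, 8.00, 0.32, 0.06
assert abs(raw_ss - (avg_ss + signal_ss + noise_ss)) < 1e-9

prediction_ss = avg_ss + signal_ss
print(round(math.sqrt(raw_ss), 2))                # 2.89  raw data vector
print(round(math.sqrt(avg_ss), 2))                # 2.83  data average vector
print(round(math.sqrt(signal_ss + noise_ss), 2))  # 0.62  variation vector
print(round(math.sqrt(signal_ss), 2))             # 0.57  profit signal vector
print(round(math.sqrt(noise_ss), 2))              # 0.24  noise vector
print(round(math.sqrt(prediction_ss), 2))         # 2.88  prediction vector
```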

“Anyway, we get the prediction vector by adding together the profit signal and data average vectors (Table 4). In this case, the length of the prediction vector is the square root of 8.32, which equals 2.88 inches. It is always just a tad shorter than the raw data vector.”

Table 4 The vector of hypothetical predicted values is the sum of the profit signal and data average vectors. It gives the predicted average flight times for the two designs. It has two degrees of freedom because it is determined by two numbers.

“It sure is colorful. Could we have hot pink Sculpey Clay points in space instead of green ones?”

“We sure can. I am going to need to bake mine in the break room toaster oven for a few minutes so the clay firms up and holds onto the vector skewers. I think Sculpey Clay is a Six Sigma product.”

“Say, I just realized if you set one of those up on its end, it even looks like a radio tower sending out profit signals,” noted Dick.

“Wow, look at that p-value in the table,” said Tom, tearing his gaze away from Mary’s profit signals radio tower. “There really is a difference between the two designs.”

“The null hypothesis is that there is no difference between the designs,” Mary hypothesized.

“Absolutely right,” said Avona. “A p-value less than 0.01 gives evidence beyond a reasonable doubt against

the null hypothesis. The spreadsheet actually has a formula called FDIST that calculates the p-value. It was named after Ronald Fisher. You just plug in the F ratio value, one degree of freedom for the profit signal and three degrees of freedom for the noise vector, and voilà. By subtracting the p-value 0.001 from the number 1, we get the number 0.999. This means we can be 99.9% confident that there is a difference between the pink and white designs. Even though the average difference is only 0.4 seconds, the vector analysis is sensitive enough to detect it. There is hardly any noise in this data at all. It is almost all profit signal! We are shredding that straw man like a mogul field at Mount Baker. Phenomenal work team!

“So, which design works best?”

“What is most profitable is best!” Tom, Dick and Mary sang out. “Everyone can see pink helicopters are best. Plus, that’s another $400,000 in profit per helicopter.”

“It certainly looks that way,” said Avona. “But before we release the pink design to production, let’s do a confirmation experiment. And while we’re at it, let’s include the green design in the comparison. We don’t have much data on that. See you guys later.”

“Gee whiz Mary, why is Avona such a stick-in-the-mud?” said Tom after Avona had left. “And why does she keep saying ‘we’ when she really means us?”

“Just be grateful she didn’t talk about evidence again.”

Comparing Three Ways of Doing Things

“Wow! I think we are onto something with these pink helicopters. Pink helicopters are best. We even checked them against green helicopters. They still came out best,” said Dick.
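Excel's FDIST takes an F ratio and two degrees of freedom and returns the p-value. For the special case quoted above (1 degree of freedom for the profit signal, 3 for the noise) the tail probability has a closed form through the t distribution. The sketch below illustrates the idea; it is not the spreadsheet's internal algorithm:

```python
import math

def fdist_1_3(f_ratio):
    """P(F(1, 3) > f): an FDIST-style p-value for 1 and 3 degrees of freedom.

    Uses F = T^2, so P(F > f) = 2 * P(T > sqrt(f)) where T has a t
    distribution with 3 degrees of freedom, whose CDF has a closed form.
    """
    t = math.sqrt(f_ratio)
    cdf = 0.5 + (1 / math.pi) * ((t / math.sqrt(3)) / (1 + t * t / 3)
                                 + math.atan(t / math.sqrt(3)))
    return 2 * (1 - cdf)

# The bigger the F ratio (signal vs. noise), the smaller the p-value.
print(round(fdist_1_3(5.54), 2))    # 0.1
print(fdist_1_3(200.0) < 0.01)      # True: beyond a reasonable doubt
```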

“And what is best is most profitable,” said Avona. “Let’s plug your numbers into my spreadsheet template. I want to show it to Rotcev Sisylana, our new CEO from Uzbekistan. He’s gonna love this. Maybe he will get me two copies of my data matrix software. Shoot, they cost less than a thousand dollars. I wasted more than that last week dinking around with my spreadsheet templates.”

Avona’s analysis is presented in Table 5.

Table 5 Vector analysis for comparing three helicopter designs. The raw data are flight times minus the objective of 9 seconds. The profit signal consists of the average variation for each design. It has two degrees of freedom because it is determined by two numbers, 0.325 and –0.125. The third number, –0.200 in this case, is minus the sum of these two.

The null hypothesis is that all three designs will have the same average flight time. The p-value of 0.004 says there is evidence beyond a reasonable doubt that this is false. In other words, at least one of the designs is significantly different from another. Which one is best? From the profit signal, we can see that the pink design flies 0.325 seconds longer than the average flight time of 0.9 seconds. The white and green design flight times are 0.125 and 0.200 seconds shorter than average. Once again, pink is best.
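The profit signal for three designs has only two degrees of freedom because the three average deviations must cancel; knowing two of them determines the third. A small sketch using the effect sizes quoted in the text:

```python
# Comparing three designs: each design's average deviation from the grand
# average is a profit signal entry, and the deviations must sum to zero.
pink, white = 0.325, -0.125
green = -(pink + white)          # the third number is minus the sum of the two

effects = {"pink": pink, "white": white, "green": green}
print(effects["green"])                     # -0.2
print(round(sum(effects.values()), 10))     # 0.0: the deviations cancel

best = max(effects, key=effects.get)
print(best)                                 # pink
```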

After reviewing the results in Table 5, Dick observed, “This table looks just like all the others except it’s taller.”

“Can I see those helicopters first hand?” asked Avona. “I would love to watch them fly.” After carefully observing a few flights she noticed something the others had missed. “Have you noticed that the pink helicopters have longer blades than the white and green ones?”

“What?” blurted Tom and Mary. “We never noticed that before! Maybe it’s actually the longer blades that cause the longer flight times.”

“Of course,” added Dick. “It’s obvious that flight time should depend on blade length, not on color.” Tom and Mary said nothing, but they each wondered why Dick had not mentioned this “obvious” thing earlier.

“Thank you Dick. But we are wasting time and money by analyzing only one factor at a time. We’ve spent $216 million and we still don’t know anything about our other product features. When I got my PhD, I learned that the way to maximize the evidence in an experiment is to study several factors at the same time. Let’s do a cube experiment!”

“Oh no,” whispered Mary to Tom. “It’s bad enough when she talks about evidence. Now it’s cubes.”

“Do we have to start over, Avona?” asked Tom.

“Not completely,” Avona responded.

Comparing Eight Ways of Doing Things

But Avona was right. The cube experiment they decided to run had three factors: color, paper-clip ballast, and blade length. As shown in Table 6, each factor had two levels (settings or choices).

Table 6 The data matrix for the cube experiment run by Avona, Tom, Dick and Mary.

Avona had lost patience with her senior management peers. She had finally purchased her own copy of the statistical software and installed it on her laptop. Anyway, she first showed everyone the vector analysis in her spreadsheet template (Table 7).

“I notice this table is just the same as the others, except it’s wider.”

“Thank you, Richard. As a point of comparison, it looks like we have two statistically significant profit signals. The p-values for paper clip ballast and blade length are 0.047 and 0.028, respectively.

“By looking at the profit signal vector for paper clip (Y), we can see that adding weight subtracts 0.12 seconds from the overall average flight time. Also, we can see that not adding the weight to the helicopter adds 0.12 seconds to the overall average flight time. Overall, this means that not adding the weight to the helicopter increases the average flight time by 0.24 seconds compared to adding the weight. That’s $240,000 additional profit per helicopter sold.

“By looking at the profit signal vector for blade length (Z), we can see that using the short blade subtracts 0.20 seconds from the overall average flight time. Also, we can see that using the long blade adds 0.20 seconds to the overall average flight time. Overall, this means that using the long blade instead

of the short blade increases the average flight time by 0.40 seconds. That’s $400,000 additional profit per helicopter sold. The combined effect of these two changes is an increase of 0.64 seconds. This means a total of $640,000 additional profit per helicopter sold. We’ll make millions.”

Table 7 Vector analysis for the cube experiment run by Avona, Tom, Dick and Mary. The raw data are flight times minus the objective of 9 seconds.

Mary asked, “I know that X, Y and Z are code names for the three factors. But what do XY, XZ and YZ mean?”

Avona said, “They are code names for the interactive effects among the factors. An interactive effect exists when the effect of one factor depends on the level (choice or setting) of another factor. In this case there were no significant interactions. Usually there are.”

Tom asked, “Is that why it was OK to just add together the effects of paper clip and blade length?”

“Exactly!”

Next, Avona opened her statistical software and clicked her mouse a few times. Up came the Pareto chart in Figure 5.
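The cube-experiment arithmetic can be sketched with coded ±1 factor columns. The flight times below are made up to mimic the effect sizes quoted in the text; they are not the book's Table 6 data:

```python
# Hypothetical 2^3 cube experiment. Coded columns: X = color (+1 pink),
# Y = paper clip (+1 added), Z = blade length (+1 long). The made-up
# response has a -0.12 clip effect and a +0.20 blade effect, no noise.
runs = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]
times = [9.0 - 0.12 * y + 0.20 * z for x, y, z in runs]

def effect(col):
    """Average change in flight time per unit of a coded factor column."""
    return sum(c * t for c, t in zip(col, times)) / len(times)

X = [x for x, y, z in runs]
Y = [y for x, y, z in runs]
Z = [z for x, y, z in runs]
XY = [x * y for x, y, z in runs]   # interaction column: product of codes

print(round(effect(Z), 2))    # 0.2   long blade adds 0.20 seconds
print(round(effect(Y), 2))    # -0.12 adding the paper clip costs 0.12 s
print(round(effect(X), 2))    # color: no effect in this sketch
print(round(effect(XY), 2))   # interaction: negligible in this sketch
```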

Figure 5 Modern statistical software presents analysis results as pictures.

Everyone was taken aback to see Avona use a bar chart. “For heaven’s sakes Avona,” complained Mary. “Have you become a bar chart bamboozler?”

“Not really. It’s just that modern software manufacturers are smarter than they used to be. They found out customers wanted a quick visual analysis of which factors have the largest effects. Everyone can see just by looking which factors make the biggest difference.”

Comparing 256 Ways of Doing Things

“Rotcev wants us to test eight different variables,” cried Mary. “That is 2⁸, or 256, combinations. That would cost us $9 million times 256, or $2.3 billion!”

“But with our data matrix software we can screen all eight factors with only 16 helicopters,” said Avona. “That would cut our R&D costs by 94 percent.”

“Good thinking. Rock on!” shouted Tom.

The team built 16 helicopters with different configurations using two different levels of Rotcev’s 8 factors: paper type, wing tape, body length, aerodynamic folding, body width, blade length, paper clip, and body tape. Figure 6 shows the data matrix for the experiment, including the flight times that were obtained.
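The reason 16 runs can screen 8 factors is that four of the factor columns are built as products of the other four. The generators below are one standard choice for a 16-run, 8-factor fraction; the book's software may pick a different one. Every pair of columns stays orthogonal, which is what lets each effect be estimated cleanly:

```python
from itertools import product

# Start with a full 2^4 factorial in four base factors A, B, C, D, then
# define the other four columns as products of base columns.
base_runs = list(product((-1, 1), repeat=4))          # 16 runs
design = [(a, b, c, d, a * b * c, a * b * d, a * c * d, b * c * d)
          for a, b, c, d in base_runs]                # E=ABC F=ABD G=ACD H=BCD

print(len(design))                                    # 16 helicopters
for i in range(8):
    for j in range(i + 1, 8):
        dot = sum(row[i] * row[j] for row in design)
        assert dot == 0          # every pair of columns is orthogonal
print("all 28 column pairs orthogonal")
```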

Figure 6 Statistical software automatically determines the hyperspace geometry for testing eight different variables simultaneously using only 16 experiments.

The software calculated the vector analysis in less time than it took to click the mouse. The Pareto chart ranking the eight factors by strength of signal is shown in Figure 7.

Figure 7 Statistical software automatically rank orders each factor according to the size of its Profit Signal strength.

“If I read this right, very few of the other factors, including the expensive paper, make a difference. It looks like we could be over-engineering our product,” observed Dick.

“Very astute thinking Dick,” said Avona.

“I think you just figured out a few good ways for us to make more money,” complimented Mary. “Rotcev needs to meet this team and hear about these results soon.”

Chapter Homework

Think of these two elements—profit signals and noise—by using your cell phone as an analogy. Strong signals are easy to understand. Noise or static is impossible to decipher.

Variation is everywhere, always. Variation surrounds every measurement and measurement system. For example, weigh yourself on a bathroom scale and record this measurement. Wait a few moments and weigh yourself again. Weigh yourself every hour and keep a running record throughout the day. You will discover that your weight may vary by as much as six to 10 pounds per day. Why? Everything varies, including your weight and the system used to measure it. Chance or random variation is a phenomenon of nature.

The right triangle vector illustrations in this book show how all measurements, all data sets, be they complex or simple as pie, can be decomposed into these two parts. A data matrix and the rules of a vector analysis sort profit signals from noise. Noise has its own vector. The strong signals in our exercise data matrix came from the two factors that influenced the outcome.

Statistical evidence is a ratio. Evidence is the length of the profit vector divided by the length of the average noise vector. Evidence, when used to make business decisions, leads to consistently reliable predictions and Six Sigma style profits.

The deceptive simplicity of 2³ cube arrays makes visual, correct, and statistically significant inferences possible. It makes analysis fast and intuitive. There is no need to work equations. So long as you stick with the inherent discipline of a data matrix, you and your colleagues will simply be able to see the answers to problems. In this way, facts can be seen by anyone at a glance. Deliberate analytic speed saves enormous amounts of time.
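The signal-plus-noise decomposition described here can be checked numerically for any data set. A minimal sketch of the right-triangle identity, using an arbitrary three-number example:

```python
# For any set of numbers, the squared length of the raw data vector equals
# the squared length of the average vector plus the squared length of the
# deviations (noise): the New Management Equation applied to data.
def decompose(data):
    mean = sum(data) / len(data)
    avg_ss = len(data) * mean ** 2
    noise_ss = sum((x - mean) ** 2 for x in data)
    raw_ss = sum(x * x for x in data)
    return raw_ss, avg_ss, noise_ss

raw_ss, avg_ss, noise_ss = decompose([3.0, 4.0, 5.0])
assert abs(raw_ss - (avg_ss + noise_ss)) < 1e-9    # 50 = 48 + 2
print(raw_ss, avg_ss, noise_ss)
```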

Closing Arguments

Orville Wright, one-half of the team that used The New Management Equation to create the airplane, comments on the use of data: “I have myself sometimes found it difficult to let the lines run where they will, instead of running them where I think they ought to go. My conclusion is that it is safest to follow the observations exactly.”6

Endnotes
1. Sloan, M. Daniel. Using Designed Experiments to Shrink Health Care Costs. Milwaukee: ASQ Quality Press, 1997.
2. Dilson, Jesse. The Abacus: The World’s First Computing System: Where It Comes From, How It Works, and How to Perform Mathematical Feats Great and Small. New York: St. Martin’s Press, 1968.
3. A New Aristotle Reader. Edited by J.L. Ackrill. Princeton: Princeton University Press, 1987. Page 39.
4. Sagan, Carl. The Demon Haunted World: Science as a Candle in the Dark. New York: Ballantine Books, 1996. Page 32.
5. Sagan, Carl. The Demon Haunted World: Science as a Candle in the Dark. New York: Ballantine Books, 1996. Page 328.
6. Jakab, Peter L. Visions of a Flying Machine: The Wright Brothers and the Process of Invention. Washington: Smithsonian Institution Press, 1990. Page 140.


Chapter 6
Predicting Profits

Making accurate predictions is an important, difficult task. By now, you may not be surprised to learn vector analysis is the international standard for making predictions as well as for making comparisons. What a wonder. This is good news for Corrugated Copters and your company too. The vector analysis methods for solving prediction problems are known as regression modeling and analysis. Before returning to the exploits of our Six Sigma breakthrough project heroes Mary, Dick, Tom, Avona, and Rotcev, an orientation to basic correlation and regression concepts is in order.

“Correlation assesses the tendency of one measure to vary in concert with another,” wrote Stephen Jay Gould in The Mismeasure of Man. “The invalid assumption that correlation implies causation is probably among the two or three most serious and common errors of human reasoning.”1 Experienced managers candidly acknowledge that cost-accounting variance analysis is based on this faulty premise. No wonder old school spreadsheet forecasts bear so little relationship to actual business sales, revenue, earning, inventory, and other performance metrics. Month after month, trillions of dollars in corporate and governmental resources are squandered trying to explain prediction errors that are inevitable. Never is it recognized that the one-dimensional “prediction” methods mechanized in spreadsheets and institutionalized by business

school curriculums were made obsolete in 1890 by Charles Darwin’s half-cousin Francis Galton. In his 1890 essay Kinship and Correlation, Galton wrote, “Few intellectual pleasures are more keen than those enjoyed by a person who, while he is occupied in some special inquiry, suddenly perceives that it admits of a wide Generalization, and that his results hold good in previously-unsuspected directions. The Generalization of which I am about to speak arose in this way.”2 Though Galton did not capitalize the word Generalization in this instance, we did so readers could see he was speaking about a true Law of the Universe.

Fingerprint Evidence

It is an entertaining and obscure footnote in the history of evidence-based decisions that by 1893—the same year the grandfather mentioned in our Premise, the man who used paper bags and arithmetic to cipher out his farm’s business transactions because he didn’t trust the new fangled way of doing things called multiplication—the genius Galton was pioneering the use of fingerprints as forensic evidence.3 This breakthrough soon found its way into courtrooms of judgment and justice around the world.4

Quoting from Internet sales literature: “Biometrics is a technology that analyzes human characteristics for security purposes. The voice, iris, hand, and face can be used in addition to fingerprints.”5 This technology is the grandchild of Francis Galton’s imagination.

The graphic statistical results of vector analysis, applied to a data matrix, are fingerprints. They are the fingerprints every process leaves behind. Each fingerprint data set exhibits unique swirls, bifurcations, endings and statistically significant patterns. Think movies. Think prime time, reality TV. Think Disney. Think Spielberg and Lucas. Most all forensic evidence presented by the entertainment industry in whodunits,

murder mysteries, and gumshoe adventures is based on the New Management Equation. Serious biometric predictions—be they concerned with acute lymphoblastic leukemia, therapeutic vaccine studies, or criminal investigations—all use the Pythagorean Theorem, c² = a² + b². This evidence adds up.6

Forecasts conjured without the cornerstone of evidence and the New Management Equation belong in a dust bin with auras, divining rods, Tarot cards, palmistry, past lives, Rune readings, soothsaying, and good old-fashioned wishful thinking. Profits are too important to be left to the Chance coincidence that a paranormal guess will sometimes be right.

This chapter is weighty. Given the stakes of international commerce, the weight of evidence we present in this chapter is appropriate. If the reading gets a bit heavy for you, peek at the illustrations. Look for the right triangles. Those pictures are our wink at you. You know the secret handshake and inside joke. Remember, you will never have to do any of these calculations. Ever. Data matrix software takes your data and lays it all out for you. A vector analysis applied to a data matrix is a correct analysis.

Three Wishes

Cost-accounting variance analysis has been around almost as long as Aladdin’s Lamp. G. Charter Harrison’s standardized cost model was a step forward in 1918. In 2003 it is too simplistic to satisfy international standards for quantitative analysis. Even worse, it is the mother of white-collar waste and rework. Surely, it must have some merit. If we rub it, and rub it, and rub it again with our erasers, the Genie will appear. Will we not be granted our three wishes? Unfortunately, no. We will not.

For example, consider the traditional break-even analysis pictured in Figure 1. Here are the three wishes:

Figure 1 The traditional break-even analysis is a good example of wishful thinking in the white-collar work place. The chart plots Dollars against Product or Service Volume, with straight lines for Averaged Income and Averaged Expenses.

1. I wish my expenses were exactly a straight-line function of volume.
2. I wish my revenue were exactly a straight-line function of volume.
3. I wish the relationship between these lines never changed.

Granting these three wishes would be equivalent to suspending the physical laws of our universe. Even a widescreen Disney genie would decline this opportunity. Noise, Chance or random variation, attends every measurement. The mythological Greek Sisyphus had a better chance of rolling his rock to the top of his hill than a manager has of making his monthly results fall exactly on a hypothetical straight line.

As numbers in the first column of Table 1 increase, numbers in the second column increase by an exactly proportional amount. This perfect linear relationship produces perfect predictions. These are plotted in Figure 2. They get rave reviews in management meetings. This sort of line is a sure sign that shenanigans, rather than standards of evidence, are in use.
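Under the three wishes, the break-even picture collapses to a one-line formula. A sketch with hypothetical numbers (the point of the chapter is that real income and expenses never behave this neatly):

```python
# Wishful-thinking break-even analysis: income and expenses assumed to be
# exact straight lines of volume. All numbers here are hypothetical.
price_per_unit = 9.0        # averaged income per unit
cost_per_unit = 6.0         # averaged variable expense per unit
fixed_expenses = 600.0      # expenses at zero volume

# Volume at which the income line crosses the expense line.
break_even_volume = fixed_expenses / (price_per_unit - cost_per_unit)
print(break_even_volume)    # 200.0 units

# Reality check: actual results vary around any such line, so the crossing
# point means little without a statement of the noise around it.
```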

Table 1 Wishful thinking versus reality. The ‘wish profits’ are hypothetical straight-line predictions. The ‘real profits’ are actual results.

Figure 3 shows the actual performance numbers on which the linear relationship was based. There is, you guessed it, a huge amount of variation. A single-number prediction is useless without a statement of prediction error based on the degree of variation in the process being predicted. We need to see the profit signal and noise vectors.

Figure 2 Wishful thinking: results falling exactly on the straight-line prediction earn rave reviews in management meetings.

Figure 3 The straight-line predictions were based on real profit data containing a huge amount of variation.

A persistent leadership “homily” suggests that idealized targets “inspire” superior performance. We have observed the opposite. Arbitrary goals are products of wishful thinking. They are exercises in futility. They waste time and money that might otherwise find its way to the bottom line. They demoralize and debilitate the people assigned to achieve them. Even the best of intentions cannot redeem a patently false premise.

Prediction Practice

“Say Avona, I have an idea,” said Mary. “You would probably call it a hypothesis. First I noticed there is quite a bit of variation in our flight times. Then I noticed there is quite a bit of variation around our target blade length. Tom told me it would be difficult and expensive to put tighter controls around our tolerance specifications. Finally, I noticed there is also quite a bit of variation in our drop height. I have a hunch we might even be able to predict the flight time from the drop height.”

“That’s a great idea,” said Avona. “But why would you want to predict flight time from drop height?”

“You know perfectly well why! If we can accurately predict performance we can anticipate the future. It would be like knowing what the stock market is going to do tomorrow. We could use that knowledge to make more profits.”

“You know physics, so your hunch has my attention,” said Avona with a knowing smile, “but can you be a little more specific?”

“Well, we already know blade length is a key control variable. My hypothesis is that drop height could be used as another control variable. Then, if we could predict flight time from drop height, we could compensate for a variation in blade length by making an opposite, off-setting change in the drop height. This would be relatively easy to do. This way, we could hit our advertised flight times with much less variation. What is most profitable is best.”

“Make it so,” said Avona.

“Can you give me a preview of how we’re going to do this?” asked Mary.

“With pleasure,” said Avona. “Let’s say your data looks like this (Table 2).”

“Wait a minute,” said Mary. “This isn’t the same as the other things you showed us. We aren’t trying to find the best way of doing something. We’re trying to determine a relationship.”

“Exactly,” said Avona. “In problems that involve a relationship between two variables, we have to know which is the

independent variable and which is the dependent variable. The independent variable is the one we will use to make the prediction, which for us is the drop height. The letter X is used to symbolize the independent variable. The dependent variable is the one we want to predict, which for us is the flight time. The letter Y is used to symbolize the dependent variable.”

Table 2 The data matrix array for three flight times paired with three different drop heights.

“OK, but what’s this ‘coded drop height’ about?” asked Mary.

“The values -1, 0 and 1 are codes for low, medium and high drop heights. At the minus setting I was sitting in my chair. At zero I was standing. At 1 I got up on my desktop. I know it would look better if we put in the actual drop heights, but it’s easier to explain if we use the codes. It’s also easier to draw the picture and set up the spreadsheet template. Of course, we don’t have to bother with the coding when we use our statistical software. It takes care of everything.

“Anyway, here is the vector analysis from my spreadsheet template for this practice problem (Table 3). We’re modeling flight time as a linear function of drop height. If we had more data, we might try something more elaborate. The only difference between this vector analysis and the ones we did before is the way the profit signal vector gets calculated. The best-fitting straight line is shown in Figure 4. Look closely and you can roughly see that the slope of the predicted line in this case is 1.25. For our actual flight times 8.5, 9.5 and 11.0, the straight-line predictions are 8.42, 9.67 and 10.92. Notice that 8.42 plus 1.25 equals 9.67. Also notice that 9.67 plus 1.25 equals 10.92.
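The slope and predictions quoted in this exchange can be reproduced with ordinary least squares; a minimal sketch in plain Python (our illustration, not the book’s spreadsheet template):

```python
# Practice data from Table 2/3: flight times at coded drop heights -1, 0, +1.
x = [-1.0, 0.0, 1.0]          # coded drop heights (low, medium, high)
y = [8.5, 9.5, 11.0]          # observed flight times in seconds

n = len(x)
x_bar = sum(x) / n            # 0.0 for these symmetric codes
y_bar = sum(y) / n            # overall average flight time

# Least-squares slope and intercept for y = intercept + slope * x
slope = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
        sum((xi - x_bar) ** 2 for xi in x)
intercept = y_bar - slope * x_bar

print(round(intercept, 2), round(slope, 2))   # 9.67 1.25, as in the text
predictions = [round(intercept + slope * xi, 2) for xi in x]
print(predictions)                            # [8.42, 9.67, 10.92]
```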

Table 3 Vector analysis for fitting Y (flight time) as a linear function of X (drop height).

“We get the profit signal vector by multiplying this slope times the coded X data vector. This means the forecast goes up 1.25 Y units for every coded X unit. Now, can you tell me whether or not the slope is statistically significant?”

Figure 4 Best-fitting straight line for Y as a function of coded X.

Mary thought for a moment, then said, “It doesn’t achieve the ‘clear and convincing’ standard of evidence, because the p-value is greater than 0.05. It does achieve the ‘preponderance of evidence’ standard, because the p-value is smaller than 0.15.”

“Well done,” Avona exclaimed. “If the slope is larger, the relationship between X and Y is stronger, and the profit signal vector is longer. If the slope is smaller, the relationship between X and Y is weaker, and the profit signal vector is shorter. Here is the drawing for this vector analysis (Figure 5).

Figure 5 Picture of the vector analysis for fitting Y as a linear function of X.

“Notice that the profit signal vector is parallel to the coded X data vector at the lower left. This is always true because the profit signal vector is always equal to the coded X data vector multiplied by the slope of the best-fitting line.”

Now Mary had a question. “OK, but how do I use all this to make a prediction?”
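Mary’s reading of the evidence can be checked numerically. The sketch below recomputes the signal and noise vectors and the F ratio for the practice data; the closed-form p-value for an F statistic with 1 and 1 degrees of freedom is our addition, not the book’s:

```python
# Signal/noise decomposition of the practice data (coded x has mean zero).
import math

x = [-1.0, 0.0, 1.0]   # coded drop heights
y = [8.5, 9.5, 11.0]   # flight times

y_bar = sum(y) / len(y)
slope = sum(xi * (yi - y_bar) for xi, yi in zip(x, y)) / sum(xi**2 for xi in x)

signal = [slope * xi for xi in x]                       # profit signal vector
noise = [yi - y_bar - si for yi, si in zip(y, signal)]  # noise vector

signal_ss = sum(s**2 for s in signal)   # squared length of the signal vector
noise_ss = sum(e**2 for e in noise)     # squared length of the noise vector

# F ratio with 1 and n - 2 = 1 degrees of freedom
F = signal_ss / (noise_ss / (len(y) - 2))

# For F(1, 1) the p-value has a closed form via the Cauchy distribution
p = 1 - (2 / math.pi) * math.atan(math.sqrt(F))
print(round(F, 1), round(p, 3))   # 75.0 0.073 — between 0.05 and 0.15
```

The p-value of roughly 0.073 is indeed greater than 0.05 and smaller than 0.15, matching the standards-of-evidence reading in the dialogue.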

Avona responded, “Graphically, what we do is start at a coded X value in Figure 4, go up to the fitted line, then over to the raw Y data axis. The number on that axis is the prediction. For example, if we had an X value of 0.5, the predicted Y value would be somewhere between 10.0 and 10.5.

“We can be more exact if we are willing to deal with the actual equation of the line:

Predicted flight time = 9.67 + 1.25 × (Coded drop height)

“9.67 is the average flight time in our practice data set. 1.25 is the slope. If the coded drop height is 0.5, then we get:

Predicted flight time = 9.67 + 1.25 × (0.5) = 9.67 + 0.63 = 10.30

“Applying this equation to the coded X values -1, 0 and 1 is the same as adding together the Y data average vector and the profit signal vector. The result of this is called the predicted Y vector (Table 4). If you plotted the predicted Y values versus the coded X values, they would fall exactly on the straight line in Figure 4.”

“Is the predicted Y vector visible in Figure 5?” asked Mary.

“Yes,” said Avona. “It’s the vector in the shaded plane labeled ‘Regression Line’. It goes from the point (0, 0, 0) up to the point where the noise and profit signal vectors intersect.”

Mary had one final question. “If we make a prediction, don’t we have to state the prediction error based on the variation in our data?”

“Right again,” answered Avona. “The statistical software will automatically show you the prediction error. But let’s save that for when you get your real data.”

Table 4 Predicted Y vector from fitting Y as a linear function of X.

“I see you bought your own copy of the statistical software,” observed Avona.

“Yeah. I get four times as much work done in a quarter of the time it would take with a spreadsheet. I’m giving Corrugated Copters much better information and spending more time with my family, too.”

Predicting Real Flight Times

“OK, Avona, I have my data now,” announced Mary as she barged into Avona’s office. “We did five tests at each of three drop heights. I used coded values -1, 0 and 1 for the low, medium and high drop heights. I entered my data into the spreadsheet template you gave me (Table 5).

“Anyway, about my study. The p-value is 0.000, beyond a reasonable doubt, so there is a real relationship here. The noise standard deviation is 0.056 seconds. The overall standard deviation of the flight times is 0.288 seconds. This is astonishing. We can eliminate 81% of our flight time variation by controlling the drop height.”

“Good job, Mary. How in the world did you come up with 81 percent?”

“Well, I took a lucky guess,” admitted Mary. “That was hard. I used the last line in Table 5 to puzzle it out. See, the standard deviation of the variation vector is 0.288. The standard deviation of the noise vector is 0.056. 19.4 percent of 0.288 is 0.056. It all added up, so I figured that is why you put the last line in your spreadsheet template.”

“Impressive,” admitted Avona. “Great job.”

“Thank you. Now check me on this next thing. I think the predictive equation is this: Predicted flight time = 9.60 + 0.34 × (Coded drop height). Is that right?”

Table 5 Vector analysis for fitting flight time (Y) as a linear function of coded drop height (X). The raw data are Mary’s actual flight times minus the objective of 9 seconds.
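Mary’s 81 percent is the complement of the noise-to-overall ratio of standard deviations; a sketch of that arithmetic:

```python
# The two standard deviations Mary reads off the last line of Table 5.
overall_sd = 0.288   # standard deviation of the flight times (seconds)
noise_sd = 0.056     # standard deviation of the noise vector (seconds)

remaining = noise_sd / overall_sd    # fraction of variation left as noise
eliminated = 1 - remaining           # fraction removable by controlling drop height
print(round(remaining * 100, 1))     # 19.4 percent, as in the dialogue
print(round(eliminated * 100))       # 81 percent
```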

“Yup,” answered Avona. “0.60 is the overall average of your Y data, and 0.34 is the slope of the best-fitting line using coded X data. Your equation is right on the money. You’re batting 1000 today, Mary. And I mean ‘money’ in the literal sense, given the reduction in variation you’ve demonstrated.

“Yesterday, I said I would show you how to use the statistical software to get a picture with predictions and prediction limits. Let’s do that now.” Avona clicked her mouse two or three times to produce the graph in Figure 6. “This shows the data, the best-fitting line, and the 95% prediction limits. When these limits are narrower, your predictions are more accurate. When they are wider, your predictions are less accurate.”

“Is there an easy way to calculate the limits?” asked Mary.

“Yes. The 95% prediction limits are approximately two noise standard deviations above and below the line. The upper limit is approximately two noise standard deviations above the predicted value, and the lower limit is approximately two noise standard deviations below the predicted value. Your noise standard deviation is 0.056 seconds. Two times this is 0.112. So, with 95% confidence, predictions from your equation will be accurate almost to plus or minus one tenth of a second. Not bad.”

Figure 6 Picture of the best-fitting straight line for predicting flight time (Y) as a linear function of coded drop height (X).
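A sketch of the two-noise-standard-deviation limits, using the prediction equation as reconstructed in this dialogue (the 9-second objective added back; the equation itself is Mary’s, the loop is our illustration):

```python
# Approximate 95% prediction limits: two noise standard deviations
# above and below the fitted line.
noise_sd = 0.056   # noise standard deviation from Table 5 (seconds)

def predict(coded_drop_height):
    # Mary's fitted line for flight time in seconds
    return 9.60 + 0.34 * coded_drop_height

for x in (-1, 0, 1):
    y_hat = predict(x)
    lower = y_hat - 2 * noise_sd
    upper = y_hat + 2 * noise_sd
    # each prediction carries a band of plus or minus 0.112 seconds
    print(x, round(y_hat, 2), round(lower, 3), round(upper, 3))
```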

Closing Arguments

Peter L. Bernstein, noted economist, economic advisor to nations and multinational companies, and author of the Business Week, New York Times, and USA Today best seller Against the Gods: The Remarkable Story of Risk, comments on the value of prediction and regression. “Commonplace as it seems today, the development of business forecasting in the late seventeenth century was a major innovation.”7 “Forecasting—long denigrated as a waste of time at best and at worst a sin—became an absolute necessity in the course of the seventeenth century for adventuresome entrepreneurs who were willing to take the risk of shaping the future to their own designs.”

“If nature sometimes fails to regress to the mean, human activities, like sweet peas, will surely experience discontinuities, and no risk management system will work very well. Galton recognized that possibility and warned, ‘An Average is but a solitary fact, whereas, if a single other fact be added to it, an entire Normal Scheme, which nearly corresponds to the observed one, starts potentially into existence.’”8

Endnotes

1 Galton’s complete works are available from a variety of library and Internet sources. This quote comes from http://www.mugu.com/galton/start.html. The origin of this web page is the comprehensive http://www.mugu.com/galton/
2 Gould, Stephen Jay. The Mismeasure of Man. New York: W.W. Norton & Company, 1996, page 272.
3 http://www.mugu.com/galton/essays/1890-1899/galton-1890nareview-kinship-and-correlation.html

4 http://www.wvu.edu/~bknc/BiometricResearchAgenda.pdf
5 http://www.fme.fujitsu.com/products/biometric/pdf/Find_FPS.pdf
6 http://www.fujitsu.com/products/biometric/pdf/Find_FPS.pdf
7 Bernstein, Peter L. Against the Gods: The Remarkable Story of Risk. New York: John Wiley & Sons, 1996, page 95.
8 Bernstein, Peter L. Against the Gods: The Remarkable Story of Risk. New York: John Wiley & Sons, 1996, page 182.

Chapter 7

Sustaining Results

Stewardship entails honorable conduct in the management of other people’s property. “The privileges of executive corporate leadership are paired with responsibilities,”1 observed William G. Scott and David K. Hart in Organizational Values in America. These responsibilities require a constant demonstration of trustworthiness in economic and personal conduct. It also includes respect for the people’s moral responsibilities.2

More than profit is at stake when a manager begins the workday. Jobs and the welfare of one’s community are on the line with every significant decision. Every corporate director and senior level executive lives in the world of uncertainty. Evidence-based decisions reduce uncertainty. They quantify financial and human risk in ways that can be validated and replicated. Six Sigma products and services are powered by evidence-based decisions. They produce near perfect, Six Sigma performance. Near perfect, Six Sigma performance engenders trust. Customers confidently bet their lives, and the lives of their loved ones, on Six Sigma performance. But it is broader than that.

Poor quality management decisions can and do injure our world. The Copper-7 Intrauterine Device (IUD) and Thalidomide are quintessential medical management mistakes. Three Mile Island and Chernobyl are monumental governmental management blunders. The after effects of the Exxon Valdez wreck in Alaska, the Union Carbide plant explosion at Bhopal, India, Hooker Chemical’s

Love Canal disaster at Niagara Falls, and the sinking of Brazil’s PetroBras platform with its resulting million-gallon oil spill all bear witness to the wide ranging effects of poor quality decisions.3

Seventeen years ago, Nobel Laureate Richard Feynman used an “Avona” model to demonstrate in no uncertain terms that NASA scientists did not understand the concept of correlation.4 It is now widely acknowledged that Challenger might still be flying if NASA managers had applied vector analysis to a data matrix in 1986. These powerful tools were widely available at the time. Unfortunately, in 2003 it is clear that NASA managers are still basing important decisions on spreadsheet calculations.

Managers around the world now understand, in dollars and cents, that how they treat a worker sewing a soccer ball in Pakistan affects not only the outcome of the World Cup, but also the political stability of the nations where they do business. As one executive told us recently, “Our home office is Earth. The staff meetings I go to look like the United Nations. We emphasize diplomacy both inside and outside the company. Our international relationships build the teamwork we need to compete.” This new level of thoughtfulness is a good thing. Six Sigma has shrunk, and is shrinking, our globe.

These are things to think about as we rejoin Corrugated Copters. Mary and Avona have stabilized their flight times. They know which way is best. Their challenge now is to hold and gain market share. They need to sustain their best practices and profits.

Evaluating Practices and Profits

“Avona, our customers are insisting that we manufacture helicopters with virtually no variation in flight time,” said Mary. “They told me we have to have a Cpk of 1.5 or better, or we are out as their main supplier. What on earth is a Cpk? Is it an abbreviation for something?”
Evidence-based decisions provide a safety net that can help protect us from poor quality judgments.

“No, it’s just another goofy Six Sigma symbol,” chided Avona.

“What?”

“Just kidding. Seriously, Cpk is a numerical index that quantifies how capable a process is of producing virtually perfect quality output. A Cpk of 1.5 is an accepted international standard, at least for components and subprocesses. A Cpk of 1.5 implies no more than 3 or 4 defective products or services per million delivered. In fact, companies that cannot or will not produce that level of quality are getting edged out or even kicked out of the marketplace.”

“That level of quality is impossible,” said Dick, who just wandered into the break room with a fresh steamed latte.

“Not at all. Six Sigma is not just a passing fad. It is an extremely disciplined way of competing for market share. This is old news.”

“You’re kidding.”

“No, I’m not.”

“What? Cpk isn’t just the same old New Management Equation?”

“Like most other things in Six Sigma, Cpk is based on the New Management Equation. It is a function of the average and the standard deviation. It puts an additional spin on these by combining them with the Upper and Lower Specification Limits. Let me show you how to calculate a Cpk value from your data.” Avona pulled out a piece of paper and wrote the following expressions.

“Cpk is defined to be the smaller of two other numbers called Cpl and Cpu. Cpl is the number of process standard deviations between the process average and the Lower Specification Limit, divided by 3. Get it? Process? Lower? Capability? Cpl? Don’t you just hate the way acronyms aren’t even arranged in order? Cpu is the number of standard deviations between the average and the Upper Specification Limit, divided by 3.”

“Why are they divided by 3?” asked Mary.

“It’s an arbitrary convention. Everyone uses it. We’re stuck with it. Anyway, we want both Cpl and Cpu to be as large as possible. Therefore, we want Cpk to be as large as possible.”

“Maybe you could draw us a picture,” suggested Dick. “You could even put a right triangle in it.”

“Good idea, Dick. I’ll use a bell-shaped curve to represent process variation. The right triangle actually gives us a convenient place to put the standard deviation. Let’s say we have a Lower Specification Limit (LSL) of 7.2 and an Upper Specification Limit (USL) of 10.8. Here’s a process with an average of 8.4 and a standard deviation of 0.6 (Figure 1). The average is 2 standard deviations above the Lower Specification Limit (LSL) and 4 standard deviations below the Upper Specification Limit (USL). Therefore, Cpl = 2/3 = 0.67

and Cpu = 4/3 = 1.33.

“Cpk is the smaller of these two numbers, so Cpk equals 0.67. This implies that roughly 2.5% of outcomes from this process will fall below LSL.”

Figure 1 A process with Cpk = 0.67. The average is 2 standard deviations above LSL and 4 standard deviations below USL. Roughly 2.5% of outcomes from this process will fall below the Lower Specification Limit.

“That’s not good,” observed Mary.

Avona drew another picture. “Here’s a process with an average of 9.0 and a standard deviation of 0.6 (Figure 2). The average is 3 standard deviations above LSL and 3 standard deviations below USL. The Cpl is 3/3 = 1.00 and the Cpu is 3/3 = 1.00. Cpk is the smaller of these two numbers, but in this case, since the process is perfectly centered, these numbers are the same. So Cpk equals 1.00. This implies that roughly 0.3% of units produced from this process will fall below LSL or above USL.”

Figure 2 A process with Cpk = 1.00. The average is 3 standard deviations above LSL and 3 standard deviations below USL. Roughly 0.3% of outcomes from this process will fall below LSL or above USL.
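The Cpl/Cpu/Cpk arithmetic, and the “roughly” defect rates quoted for these pictures, can be sketched in a few lines. The defect-rate checks assume normally distributed output, which is the usual basis for such figures:

```python
import math

def process_capability(mean, sd, lsl, usl):
    cpl = (mean - lsl) / (3 * sd)    # standard deviations to LSL, divided by 3
    cpu = (usl - mean) / (3 * sd)    # standard deviations to USL, divided by 3
    return cpl, cpu, min(cpl, cpu)   # Cpk is the smaller of Cpl and Cpu

# Avona's first two pictures: LSL 7.2, USL 10.8, sd 0.6
print(tuple(round(v, 2) for v in process_capability(8.4, 0.6, 7.2, 10.8)))
print(tuple(round(v, 2) for v in process_capability(9.0, 0.6, 7.2, 10.8)))

# Normal upper-tail probability: P(Z > z) = 0.5 * erfc(z / sqrt(2))
def upper_tail(z):
    return 0.5 * math.erfc(z / math.sqrt(2))

print(round(2 * upper_tail(3) * 100, 2))   # ~0.27% outside limits at Cpk 1.00
print(round(upper_tail(4) * 1e6))          # ~32 per million above USL at Cpk 1.33
print(round(upper_tail(5) * 1e9))          # ~287 per billion at Cpk 1.67
```

The exact 2-sigma tail is about 2.3%, which the chapter rounds to “roughly 2.5%”; the 32-per-million and 287-per-billion figures match the later pictures.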

“That’s better,” observed Mary.

“But still not good enough for Six Sigma,” said Avona. She drew a third picture. “Here’s a process with Cpk = 1.33 (Figure 3). The average is 9.6 and the standard deviation is 0.3. The average is 8 standard deviations above LSL and 4 standard deviations below USL. Therefore, Cpl = 8/3 = 2.67 and Cpu = 4/3 = 1.33. This implies that roughly 32 outcomes per million will fall above USL.”

Figure 3 A process with Cpk = 1.33. The average is 8 standard deviations above LSL and 4 standard deviations below USL. Roughly 32 outcomes per million will fall above USL.

Avona drew a fourth picture. “Here’s a process with Cpk = 1.67 (Figure 4). The average is 9.3 and the standard deviation is 0.3. The average is 7 standard deviations above LSL and 5 standard deviations below USL. Therefore, Cpl = 7/3 = 2.33 and Cpu = 5/3 = 1.67.

This implies that roughly 287 outcomes per billion will fall above USL.”

Figure 4 A process with Cpk = 1.67. The average is 7 standard deviations above LSL and 5 standard deviations below USL. Roughly 287 outcomes per billion will fall above USL.

“OK, now for your quiz,” said Avona. Mary and Dick groaned. “What would Cpk be if we moved the average to 9.0, right in the center of the specification range?”

After a moment of silence, Mary answered first. “Then the average would be 6 standard deviations above LSL and six standard deviations below USL. Cpl and Cpu would both be equal to 6/3, which is 2, so Cpk would be 2.”

“Right on,” said Avona. “And FYI, a process with that level of capability would produce no more than 2 or so defective outcomes per billion.”

“Now that’s something to aspire to,” Dick said.

Process Improvement Simulation

“Just for grins, let’s roll some dice,” suggested Avona. “I will record them as you roll. Viva Las Vegas!”

After a few rolls, Mary said, “What a surprise! I’m getting numbers between 2 and 12. Hmm, it looks like more of them are in the middle of the range than at the extremes.”
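Mary’s observation that more outcomes land in the middle than at the extremes comes straight from enumerating the 36 equally likely rolls of two dice; a sketch:

```python
# Count how many of the 36 equally likely two-dice rolls give each sum.
from collections import Counter

counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
print(counts[7], counts[2], counts[12])   # 6 1 1 — six ways to roll a 7

# Mean and standard deviation of the 36 outcomes
sums = [d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7)]
mean = sum(sums) / len(sums)
sd = (sum((s - mean) ** 2 for s in sums) / len(sums)) ** 0.5
print(mean, round(sd, 2))   # 7.0 2.42 — "about 7" and "a little over 2"
```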

She entered the numbers into her calculator. “The average is about 7 and the standard deviation is a little over 2.”

“All I get are elevens,” said Dick. “Perfect elevens every time. Something must be wrong with my dice. Oh, wait a minute. One of my dice has a five on every side and the other one has a six on every side. What’s up with that? Avona, where did you get these?”

“Wizards of the Coast,” answered Avona. “They have lots of games based on probabilities and three dimensional reasoning.”

“I bet that makes Avona happy.”

“It does,” said Avona. “By fixing the dice so you always get the same outcome, you are playing with a Six Sigma process. The only way you can get a different answer is to write down the wrong number.

“Let’s work out how many ways there are to get each possible outcome with two regular dice. Six different combinations will produce a seven, while only one combination will produce either a 2 or a 12 (Figure 5).”

Figure 5 Seven is the most probable outcome when rolling two dice.

Tom chimed in, “I know a better way to present the outcomes (Figure 6). This way it looks sort of like a bell-shaped curve.”

“Don’t you always?” said Avona. “In fact, it’s a nice segue into what I wanted to show you. We can use the throwing of two or more

dice to simulate the evolution of process capability through Six Sigma breakthrough projects. Our initial process involves four dice. Each die will represent a cause of defects. The sum of the dice will represent the number of defects per helicopter. Let’s say these defects cost an average of $100,000 each to repair. If a helicopter has 20 defects, we break even. If it has more than 20 defects, we lose money. Therefore, our Upper Specification Limit is 20 defects. That means our average profit margin is $2 million.”

Figure 6 The frequency distribution of outcomes when throwing two dice.

“20 sounds like an awful lot of defects,” said Dick. “Can’t we make it a lower number?”

“Work with me, Dick. It’s just a simulation. OK, let’s get started.”

For each simulation, one of them threw the four dice, one of them called out the result, and one of them entered the result into their data matrix statistical program. They traded jobs once in a while. After completing 1000 simulations, they decided they had had enough. Avona showed them how to do a process capability analysis with two mouse clicks (Figure 7).

Figure 7 Process capability analysis of the initial “four dice” process.
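The dice exercise is easy to mimic in code; the seed and the 1000-unit count below are our assumptions (the team rolled physical dice), and the summary statistics will land near, not exactly on, the figures in the capability analysis:

```python
# Hedged sketch of the "four dice" defect simulation: each die is a cause
# of defects, the sum is the defect count per helicopter, and any unit
# with more than 20 defects (the USL) loses money.
import random

random.seed(1)   # assumption: any seed works; fixed here for repeatability

def simulate(n_dice, n_units=1000):
    return [sum(random.randint(1, 6) for _ in range(n_dice))
            for _ in range(n_units)]

defects = simulate(4)
mean = sum(defects) / len(defects)
sd = (sum((d - mean) ** 2 for d in defects) / (len(defects) - 1)) ** 0.5
over_usl = sum(d > 20 for d in defects) / len(defects)

# mean near 14, sd near 3.4, and a few percent of units over the USL
print(round(mean, 1), round(sd, 1), over_usl)
```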

“We get only Cpu in this analysis because there is only an upper specification limit,” said Avona. “In situations like this, Cpu and Cpk are the same thing. Using the bell-shaped curve with ‘Mean’ marked on it, my calibrated eyeball tells me the average number of defects per unit is about 14. The standard deviation is about 3.6. As you can see, Cpk is 0.56. In other words, 4.7% of them will have more than 20 defects, and we will lose money.”

“We might also lose market share,” said Mary. “There’s no guarantee we can catch all the defects before a helicopter goes to a customer. It would be better to keep them from happening in the first place.”

“You got it, girl,” said Avona. “Let’s assume now that we have data-mined our process data base, prioritized the causes of defects, using vector analysis of course, and successfully eliminated one of the top causes. Our improved process involves only three dice.”

After completing another 1000 simulations, Avona showed them the capability analysis of the improved process (Figure 8).

Figure 8 Process capability analysis of the new, improved “three dice” process.

“Average defects per unit, using the calibrated eyeball method, the mean marked on the small distribution curve and the 0-25 scale just below it in Figure 8, have gone from 14 to about 10. This is an average savings of $400,000 per helicopter. The standard deviation is 3.05. Cpk is 1.05. The number of defects on 831 parts per million (PPM) will be ‘Above USL’.”
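With only an upper specification limit, Cpk equals Cpu, as Avona says. The sketch below uses the exact dice moments rather than simulated rolls, so its numbers land near, but not exactly on, the “calibrated eyeball” figures quoted from the simulation:

```python
import math

def cpu(mean, sd, usl=20.0):
    # with no LSL, Cpk reduces to Cpu
    return (usl - mean) / (3 * sd)

def frac_above(mean, sd, usl=20.0):
    z = (usl - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))   # normal approximation

four = (14.0, math.sqrt(4 * 35 / 12))   # mean, sd of the sum of four dice
three = (10.5, math.sqrt(3 * 35 / 12))  # mean, sd of the sum of three dice

# four dice: Cpk near 0.6, a few percent of units over the USL
print(round(cpu(*four), 2), round(frac_above(*four) * 100, 1))
# three dice: Cpk just over 1, a few hundred PPM over the USL
print(round(cpu(*three), 2), round(frac_above(*three) * 1e6))
```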

“In other words, we’ll make money on all but one helicopter in a thousand.”

“This does look a lot better than the old process,” said Tom. “But what if the average drifts up over time?”

“That’s a very good question,” said Avona. “That’s exactly why we can’t afford to be satisfied with a Cpk that is barely over 1. We need to eliminate other causes of defects so that upward drifts don’t result in yield loss. Let’s say we have done that, and our process now involves only two dice.”

After they completed another 1000 simulations, Avona showed them the capability analysis of the new process (Figure 9).

Figure 9 Analysis of the ultra-capable “two dice” process.

“Average defects per unit, using the calibrated eyeball method, has gone from 10 to about 7. This is an additional savings of $300,000 per helicopter. The standard deviation is about 2.5. Cpk is 1.75. The number of defects on 78 parts per billion (0.078 PPM) will be ‘Above USL’. In other words, we’ll make money on all but one helicopter in 10 million.”

“That’s not all,” added Tom. “With an upper three-sigma limit of about 14 as a control limit, we can detect upward drifts before they cause yield loss.”

“Good point,” said Avona. She did another simulation, a small one this time. She produced a control chart showing

what would happen if the causes of defects in the “three dice” process came back (Figure 10).

Figure 10 The first 40 units come from the “two dice” process, the last 10 come from the “three dice” process. The Upper and Lower Control Limits (UCL and LCL) are the three-sigma limits for the “two dice” process.

“A control chart uses the average and the three-sigma limits to monitor a process over time,” explained Avona. “In my simulation, the first 40 units came from our ‘two dice’ process. The last 10 units came from the ‘three dice’ process. Can you see what happened on the chart?”

“Yes,” said Dick. “The dots got a lot bigger.”

“Thank you, Dick. I did that myself to distinguish the three-dice units from the two-dice ones. What about the data in relation to the average and upper three-sigma limit?”

“Well,” said Tom, “the two-dice units were evenly distributed around the average, and none of them were above the upper three-sigma limit. The three-dice units are all above the average, and one of them is above the upper three-sigma limit.”

“That’s right,” said Avona. “Those are the two most important rules for interpreting a control chart. These are signals that something has changed. Actually, these rules give us operational definitions of when to initiate troubleshooting. Otherwise, everyone would have a different interpretation of the same data. So what you said before was exactly right, Tom. We can use the average and the three-sigma limits of the two-dice process to catch an upward drift before it causes any yield loss.”

“Hmm, this reminds me of the ‘standards of evidence’ you’re always talking about,” said Mary.
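The two chart-reading signals Tom applied (a point beyond the upper three-sigma limit, and a run of points all above the average) can be written as simple checks. The data, limits, and eight-point run convention below are our assumptions for illustration, not values from the book:

```python
# Detect the two most common control-chart signals in a sequence of points.
def signals(data, mean, ucl, run_length=8):
    # indices of points beyond the upper control limit
    beyond = [i for i, x in enumerate(data) if x > ucl]
    # starting indices of runs of `run_length` consecutive points above the mean
    runs = [i for i in range(len(data) - run_length + 1)
            if all(x > mean for x in data[i:i + run_length])]
    return beyond, runs

# Hypothetical defect counts: stable "two dice" behavior, then a drift
process = [7, 6, 8, 7, 5, 9, 7, 6,           # in control around the average
           11, 10, 12, 11, 13, 12, 11, 15]   # a cause of defects returns
beyond, runs = signals(process, mean=7.0, ucl=14.0)
print(beyond)   # [15] : one point above the upper limit
print(runs)     # [8]  : eight consecutive points above the average
```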

“Are control charts related in some way to that?”

“Bingo!” exclaimed Avona. “Allow me to explain.”

“OK,” said Mary, “but can we take a break first?”

Monitoring Practices and Profits

After the break, Avona showed the team a table of financial results (Table 1). “In many companies, the Executive Committee agonizes over numbers like these every quarter. They try to figure out what went wrong in Quarter 5, and who to blame. They try to take credit for Quarter 13. They make bar charts (Figure 11), and come up with imaginative explanations.”

Table 1 Quarterly financial report (thousands of dollars).

Figure 11 Quarterly financial results (thousands of dollars).

“Don’t we do the same thing?” asked Mary.

“We used to,” answered Avona, “but not any more. Not since Rotcev took over. He immediately insisted that we apply standards of evidence everywhere, not just in manufacturing.”

“Rotcev is almost as enthusiastic about this stuff as you are, Avona,” commented Dick. “But I didn’t realize you could use it on financial data. I thought the accountants had their own special ways of doing things.”

“They did,” said Avona. “That was the problem. They could make the numbers say pretty much whatever our previous CEO wanted them to say. Looking only at totals is a big problem with traditional cost accounting variance analysis. For example, if we look only at quarterly totals, we lose all the information in the monthly numbers. Looking only at monthly or quarterly totals would lose all the information in the week-to-week variations. With vector analysis, we use all the information in whatever data we have. If we had weekly data, we would start with that.

“To do this right,” continued Avona, “we need to put the numbers into a data matrix (Table 2). Then we can apply a vector analysis. The analysis in Table 2 is exactly the same as if we were comparing 15 ways of doing something.”

Table 2 Data matrix and vector analysis for quarterly review of monthly financial data.

The profit signal vector contains all the information about differences among the 15 quarters. The noise vector contains all the information in the month-to-month variations.

“Here is the cornerstone of evidence for this vector analysis (Figure 12),” said Avona. “It gives the lengths of all the vectors.”

Figure 12 The cornerstone of evidence for the vector analysis in Table 2. The numbers in parentheses are the lengths of the vectors: RAW DATA (3356), DATA AVERAGE (3352), VARIATION (185.50), PROFIT SIGNAL (90), NOISE (162). The three long vectors are not drawn to scale.

“Who can tell me if there are any significant differences among the 15 quarters?”

“There aren’t any,” Mary quickly replied. “The p-value is 0.792. It doesn’t meet any standard of evidence.”

“That’s right. We reached our conclusion because the F ratio wasn’t large enough to achieve a standard of evidence. In other words, the profit signal variation wasn’t large enough compared to the noise variation. The apparent quarter-to-quarter changes are just noise, not signals. Now, the F ratio basically compares the degree of variability in the profit signal vector to the degree of variability in the noise vector. We can confirm this visually by plotting the profit signal and noise vectors together on a single graph (Figure 13). The solid line is the profit signal vector and the dotted line is the noise vector. They are plotted in time sequence. The overall degrees of variability are about the same.”


Figure 13 The numbers in the profit signal vector are plotted as the solid line. Each of these is an average of 3 numbers in the variation vector. To make the visual comparison statistically valid, the numbers in the noise vector were first divided by the square root of 3. Don’t blame us, it’s a Law of the Universe. The adjusted noise numbers are plotted as the dotted line.

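Avona's decomposition can be checked numerically. This sketch uses invented monthly figures, since Table 2's actual numbers are not reproduced in the text; it splits the raw data vector into perpendicular average, profit-signal, and noise components and forms the F ratio from their squared lengths:

```python
import math

# Invented monthly profits (thousands), 5 quarters of 3 months each;
# Table 2's actual figures are not reproduced here.
quarters = [[310, 295, 305], [300, 320, 290], [315, 305, 300],
            [290, 310, 320], [305, 300, 315]]
data = [x for q in quarters for x in q]
n, k = len(data), len(quarters)

grand = sum(data) / n
q_means = [sum(q) / len(q) for q in quarters]

def sq(v):
    """Squared length of a vector."""
    return sum(x * x for x in v)

# Perpendicular components: raw = average + profit signal + noise.
signal = [m - grand for m in q_means for _ in range(3)]
noise = [x - m for q, m in zip(quarters, q_means) for x in q]

# Generalized Pythagorean Theorem: the squared lengths add up.
assert math.isclose(sq(data), n * grand ** 2 + sq(signal) + sq(noise))

# F ratio: variability per degree of freedom, signal versus noise.
f_ratio = (sq(signal) / (k - 1)) / (sq(noise) / (n - k))
print(f"signal length {math.sqrt(sq(signal)):.2f}, "
      f"noise length {math.sqrt(sq(noise)):.2f}, F = {f_ratio:.3f}")
```

A p-value then comes from comparing the F ratio to the F distribution with (k - 1, n - k) degrees of freedom, for example `scipy.stats.f.sf(f_ratio, k - 1, n - k)`; a small F ratio like this one yields a large p-value, matching the "just noise" verdict in the dialogue.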

“Creating a graphical comparison like this is a little trickier than it looks. To make the comparison statistically valid, I had to divide the numbers in the noise vector by the square root of 3. This is because each number in the profit signal vector is an average of 3 numbers in the variation vector. I know this is confusing, but it’s a Law of the Universe.

“Anyway, in 1924 a man named Walter Shewhart was trying to come up with a good graphical method for analyzing data over time when there is a natural or logical way of grouping the data. For example, we grouped our raw monthly data by calendar quarters. That made sense because the Executive Committee reviews it on a quarterly basis. Shewhart called these rational sub-groups.5

“Instead of plotting the profit signal vector and adjusted noise vector on top of each other, Shewhart decided it would be better to plot just the profit signal numbers, and use horizontal lines to represent the upper and lower three-sigma limits of the adjusted noise numbers. Also, he decided to add the data average vector to the profit signal and noise vectors. He felt this would be easier to interpret. In other words, he invented what we now call the X-bar control chart (Figure 14).

“This control chart tells us the same thing as the F ratio: the quarter-to-quarter changes are just noise.”

“I have a question,” said Dick. “Do we have to do the vector analysis all over again every quarter?”

© M. Daniel Sloan and Russell A. Boyles, All Rights Reserved, 2003


Figure 14 The dots are the averages of the monthly revenues in each quarter, not the totals. The centerline is the grand average of all the monthly numbers. The Upper Control Limit (UCL) is 3 noise standard deviations above the average. The Lower Control Limit (LCL) is 3 noise standard deviations below the average.

“That’s a good question,” answered Avona. “Fortunately, the answer is no. Once we have a good baseline, like we have in this example, we hold the control limits constant and just plot the new numbers as time goes by.”

“I guess the fundamental things you’ve taught us really do apply,” said Dick.

“OK, here’s your quiz. What are two ‘events’ on this chart that would indicate a real change of some kind?”

“A point outside the control limits,” said Tom.

“A bunch of points in a row above the center line,” said Mary.

“Right on both counts,” said Avona. “Remember, Mary, it could also be a bunch of points below the center line. And, by the way, the usual requirement is eight in a row for a statistical signal.”6

“This is great stuff,” said Mary. “But I’ve been wondering: aren’t we still losing some of the information in the month-to-month changes?”

“Excellent point,” said Avona. “Shewhart was aware of this problem. His solution was to plot the standard deviations of the subgroups on their own control chart (Figure 15). The two charts together give us a complete picture of what’s going on over time.”
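The baseline and the two quiz rules can be sketched in a few lines. The subgroup numbers below are invented (the book's revenue data are not reproduced); `xbar_limits` applies the square-root law Avona mentioned, and `run_of_eight` implements the eight-in-a-row requirement:

```python
import math
import statistics

def xbar_limits(subgroups):
    """Center line and three-sigma limits for subgroup averages. An
    average of n values has sigma equal to the noise sigma divided by
    the square root of n (with quarters of 3 months, sqrt(3))."""
    n = len(subgroups[0])
    center = statistics.mean([x for g in subgroups for x in g])
    pooled_var = statistics.mean([statistics.variance(g) for g in subgroups])
    sigma_mean = math.sqrt(pooled_var / n)
    return center, center + 3 * sigma_mean, center - 3 * sigma_mean

def run_of_eight(points, center):
    """True if eight consecutive points fall on one side of the center line."""
    run, side = 0, None
    for p in points:
        s = p > center
        run = run + 1 if s == side else 1
        side = s
        if run >= 8:
            return True
    return False

# Invented quarterly subgroups of 3 monthly values each.
quarters = [[98, 102, 101], [99, 103, 100], [101, 97, 102], [100, 104, 99],
            [102, 98, 103], [99, 101, 100], [103, 100, 98], [101, 99, 104]]
center, ucl, lcl = xbar_limits(quarters)
means = [statistics.mean(g) for g in quarters]
print(f"center={center:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}")
print("beyond limits:", [m for m in means if not lcl < m < ucl])
print("eight in a row:", run_of_eight(means, center))
```

Holding the baseline limits constant, as Avona says, just means computing `center`, `ucl`, and `lcl` once and reusing them as new subgroups arrive.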



Figure 15 The dots are the standard deviations of the monthly revenues in each quarter. The centerline is the average of the standard deviations. The Upper and Lower Control Limits (UCL and LCL) are three-sigma limits based on the standard deviation of the standard deviations. Strange, but true.
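Figure 15's companion chart can be sketched the same way, with invented subgroups. The limits below follow the caption's recipe literally, three-sigma limits from the standard deviation of the standard deviations; textbook S charts usually use tabled constants (c4, B3, B4) instead, so treat this as an approximation:

```python
import statistics

# Invented subgroups of 3 monthly values per quarter; Figure 15's
# actual revenue figures are not reproduced here.
quarters = [[98, 102, 101], [99, 103, 100], [101, 97, 102],
            [100, 104, 99], [102, 98, 103], [99, 101, 100]]

# The dots on the S chart: one standard deviation per subgroup.
s_values = [statistics.stdev(g) for g in quarters]
s_bar = statistics.mean(s_values)

# Limits from "the standard deviation of the standard deviations",
# floored at zero since a standard deviation cannot be negative.
s_sigma = statistics.stdev(s_values)
ucl = s_bar + 3 * s_sigma
lcl = max(0.0, s_bar - 3 * s_sigma)
print(f"S-bar={s_bar:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}")
```

Run alongside the X-bar sketch above it gives the "complete picture": the X-bar chart watches the process level, this one watches its spread.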

Taking Action

After the others left, Avona realized there was another basic fact about control charts that she needed to teach them. It wasn’t about how to set up the charts, or how to interpret them. She felt that was pretty easy. She knew from experience that control charts were all too often used as “window dressing”. Maybe “wallpaper” is a better analogy. In manufacturing at least, she knew that control charts add real value only when they are used as a basis for action.

She also knew that reacting to control chart signals was a process, just like any other business activity. In order to add value, the reaction process must be defined and documented. It must be improved over time. She had found the tools of Process Mapping to be ideal for these tasks. In her experience, it worked best to have teams of operators, supervisors, maintenance technicians, engineers and managers develop the reaction plans together. She had a reaction plan “skeleton” she always used to get them started (Figure 16).

Figure 16 A generic reaction plan “skeleton” for a manufacturing or service process.

The question “Signal?” refers to one or more pre-defined signals on one or more control charts. The charts and signals are defined by the team that develops the plan. The term “escalate” means to raise the level of the investigation by bringing in someone with greater expertise. Ideally, the manufacturing or service process is stopped until “Continue” is reached. Figure 17 shows an actual example of a reaction plan for a lot-based manufacturing process.

Figure 17 An example of a reaction plan for a manufacturing process.
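The escalation logic that Figures 16 and 17 describe can be sketched as a small loop. All names and messages here are illustrative placeholders, not taken from the book's actual reaction plan:

```python
from typing import Optional

# Roles consulted in order of increasing expertise; names are illustrative.
ESCALATION = ["operator", "supervisor", "maintenance", "engineer", "manager"]

def react(signal_confirmed, solved_by):
    # type: (bool, Optional[str]) -> list
    """Walk the reaction plan once; return the log of actions taken."""
    log = []
    if not signal_confirmed:
        # The second sample showed no signal: document and release the lot.
        log.append("document occurrence; lot continues")
        return log
    log.append("hold process")
    for role in ESCALATION:
        log.append(f"{role} works checklist")
        if role == solved_by:
            log.append("document; resume process")
            return log
    # Nobody solved it: someone of sufficient authority must decide.
    log.append("manager decides whether to resume while problem is worked")
    return log

print(react(False, None))
print(react(True, "supervisor"))
```

The key property the text insists on is encoded in the structure: once the process is held, the only exits are "problem solved" or an explicit high-authority decision.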

In this example, the team decided to confirm a control chart signal by immediately taking a second sample from the same lot. If the second sample does not show a signal, the occurrence is documented and the lot moves on to the next operation. If the second sample does show a control chart signal, the manufacturing process is put on hold while the Operator goes through a pre-determined checklist.

The checklists in a reaction plan are determined by the team that develops the plan. That is why it is so important that all vocations are represented on the team: operator, supervisor, maintenance technician, engineer, and manager. If the operator solves the problem, the occurrence is documented and the lot moves on to the next operation. Otherwise, the supervisor is called in. It may be necessary to bring in the engineer, or the maintenance technician, or even the manager. The important point is that the manufacturing process remains on hold until one of two things happens:

1. The problem is solved.
2. Someone of sufficiently high authority makes the decision to resume manufacturing while the problem is being worked on.

The keys to the success of reaction plans are:

(a) Orderly and consistent evidence-based response to problems as they occur.
(b) Visibility of problems throughout the organization, appropriate to their level of severity.
(c) Evidence-based decisions made at the appropriate levels of responsibility throughout the organization.

A disciplined approach like this is a bitter pill at first. Supervisors and managers object to the loss of production time. After a few weeks or months, the same supervisors and managers are singing the praises of their reaction plans. Invariably, they have seen their unplanned downtime plummet. Problems are being fixed right away, instead of being ignored until they become catastrophes. These short-term economic benefits are overshadowed by long-term improvements in process capability.


The old “four dice” process gives way to the “three dice” process. We find ourselves with an ultra-capable “two dice” process. We think, “That’s great, but that’s it. It’s impossible to get any better.” But we keep following our reaction plan, refining it as we learn new things. One day we wake up and, to our great astonishment, the impossible has happened. We find our competitors using us as the benchmark.

Closing Arguments

“The idea of control involves action for the purpose of achieving a desired result.” Walter A. Shewhart, Statistical Method from the Viewpoint of Quality Control, 1939, Page 139.7

You are asking yourself, is this really possible? Was Pythagoras right about right triangles? Is the earth spherical, and does it revolve around the sun? Does multiplication work? Do gravity and electricity exist? Do airplanes fly? Can you buy things on a computer and have them delivered to your door? Are vectors and hyperspace real? All the evidence we have says “yes”.

Endnotes

1 Committee on Quality of Healthcare in America: Kohn, Linda T., Corrigan, Janet M., and Donaldson, Molla S., Editors. To Err is Human: Building a Safer Health System. National Academy Press, Washington, D.C., 2001.

2 http://www.student.math.uwaterloo.ca/~stat231/stat231_01_02/w02/section3/fi4.pdf and http://www.ralentz.com/old/space/feynman-report.html

3 http://www.uri.edu/artsci/ecn/mead/306a/Tuftegifs/Tufte3.html

4 Scott, William G. and Hart, David K. Organizational Values in America. Transaction Publishers, New Brunswick, 1991.

5 Shewhart, Walter A. Economic Control of Quality of Manufactured Product. D. Van Nostrand Company, Inc., New York, 1931. Republished in 1980 by American Society for Quality Control.

6 AT&T Technologies, Inc. Statistical Quality Control Handbook. Copyright 1956 by Western Electric, Inc. Copyright renewed in 1984 by AT&T Technologies, Inc.

7 Shewhart, Walter A. Statistical Method from the Viewpoint of Quality Control. Dover Publications, Inc., New York, 1986.


Chapter 8

The Three Rs

Education and training are the first steps in building an organization founded on evidence-based decisions and the New Management Equation.1 Trust, decency and respect replace fear and favor as social adhesives. Knowledge and skill necessarily change the nature of authority.

John Adams wrestled with evidence as we all do. “Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passions, they cannot alter the state of facts and evidence.”2 “Liberty cannot be preserved without a general knowledge among the people.” American history provides an excellent road map for redefining the 3 Rs—Reading, wRiting, and aRithmetic—to Reading, wRiting, and vectoR analysis.

Adams and his colleagues were as passionate about intellectual liberty as they were about freedom. Thomas Jefferson wrote not only to Adams, but to all of us in his 1821 autobiography: “We thought that on this subject, a systematical plan of general education should be proposed, and I was requested to undertake it. I accordingly prepared three bills for the Revisal, proposing three distinct grades of education, teaching all classes.”3 This was a bold proposal. Jefferson’s friend and ghostwriter got, characteristically, to the point: “Education should be on the spot, and the best method… I call for the education of one million and thirty thousand children.”4 Those of us who enjoy the rare combined privilege of United States

citizenship and an American public education can thank Thomas Paine.5 Paine’s outlandishly impractical investment scheme turned out to be the bargain of the millennium. We predict that in the new millennium, Paine’s good deal can yield even better bottom line business results.

Six Sigma’s Hidden Factory

“I have been thinking about what you have taught us, Avona,” said Dick. “I drew a flow diagram yesterday. That darned picture kept me awake all night long. I was driving my wife crazy tossing around. So I got up at 3 AM and came into work.” Rotcev the CEO, Avona, Tom and Mary looked at Dick’s map (Figure 1).

Figure 1 The hidden factory of traditional Six Sigma.

Bold proposals can make people so uncomfortable that they would rather waste money than upset the status quo. Projects are delayed and deferred.

“Obviously I had a hard time understanding those spreadsheet tables,” said Dick. “Once you showed me the Pareto chart that rank-ordered the factors, my light finally went on. I can see exactly what you mean by my comfort zone. Numbers make me nervous. Then I say dumb things I wish I could take back. Then I get embarrassed. Thanks for not making fun of me when my lights were out.”

“You are one brave guy,” complimented Rotcev. “The last time an employee told me I was full of baloney was when I was a Vice President of Marketing. I had it all. Big desk. Big office. Four phone lines. Two secretaries and a million-dollar advertising budget. I began getting uncomfortable in 1986, not because of cost accounting, finance or data analysis. Heck, if I couldn’t make myself look better than everyone else at the annual review, I would have had a real problem.”

Everyone stared at Rotcev. Tom and Mary listened.

“Anyway, one day I was working a night shift to ‘get close to my employees.’ I sat down next to a clerk. I believe she was 19. I do remember she was a sophomore at USC. I remember that because USC stood for the University of Southern Colorado, not the University of Southern California. She was none too pleased to have me pay her a social visit, and since the lobby was vacant at one AM, I tried to keep up the conversation by expressing an interest. After about 30 minutes of chit chat, she told me she had homework to do. She shut her mathematics text and looked right at me.

“‘You know, you guys in senior management don’t have a clue, do you.’

“‘Ummm. What do you mean exactly?’

“‘Well I read that bull poop memo about your recent management decision. You know the one with all the numbers?’

“‘Yes, I do.’

“‘Well, anybody who has taken Statistics 101 at USC can tell you don’t even know how to do an Analysis of Variance.’

“I had taken Statistics 101 as a foreign exchange student at Baldwin-Wallace College in 1969, so I said, ‘Well actually I did take that class but I got a C in it and that was a gift. Please show me what you mean.’

“‘Here, I will show you.’ She pulled out a piece of graph paper, a ruler, a calculator, and a pencil. She drew me a picture. Her whole show took about five minutes. Our layoff plan to save money and our $11 million senior management data analysis on the supposed need for a massive building program just got an F. I thanked her and excused myself. I did not sleep well.

“I figured if a 19-year-old could see through the faulty reasoning behind the decisions our CEO, CFO, COO and me were making, maybe the other 1,000 employees could too. One thing led to another. I did decide to confront my math phobia. I never did learn what Analysis of Variance meant. But I learned how to draw control charts. My colleagues chose to keep on bamboozling. They all got promoted. Many of them still are. Now I am a CEO. Instead of a 19-year-old kid with spunk, I have a top-flight team.”

“Whew,” said Dick. “I thought you’d go seriously supersonic if you thought I was saying you blame accounting and finance instead of stepping up to the plate and just saying, ‘This change scares the heck out of me.’”

Rotcev looked at Dick. “So, just for the sake of the argument, let’s say your map is true. You say we have this gigantic, hidden Six Sigma factory of rework. As CEO, I am the reason for Six Sigma project rework. My comfort zone is the problem that stops projects. What should I do? What would you do?”

“I would start phasing out the Six Sigma bureaucracy,” replied Dick.

“What!” cried Tom and Mary, who had just mounted and framed their Black Belt certificates.

“All this Black Belt and Green Belt stuff is overhead,” said Dick. “Avona, her models, simulations, and the software have made Six Sigma so simple, everybody can contribute. Everybody has a brain. All the people we work with have imaginations. Let’s hire people who are literate, or who want to be literate. We could just call Six Sigma ‘literacy’.”

“Literacy?”

“Yes. Let’s call the way we work literacy and be done with it. Six Sigma is simply the use of evidence-based decisions. That idea is as old as Aristotle. It would be less costly, and more effective, if we called our Six Sigma program the Three Rs: Reading, wRiting, and vectoR analysis.” (See Figure 2)

“Let me get this straight, Dick,” asked Rotcev. “You just earned your American Society for Quality Six Sigma Black Belt certification. Do you mean to tell me that you are proposing to give it all up for the good of the company?”

“Are you saying that Black Belts aren’t needed?” said Tom with a stern look on his face.

“No. We need experts and good teachers. You are an expert. You are a teacher. But people respect you and Mary because of what you know and do, not because of your numbered certificates. I think we need everybody.”

Avona chuckled quietly and looked at her shoelaces. Rotcev nodded his head. “We could have some fun with it. Maybe it could be Reading, wRiting, and Refraction.”

“Huh?” said Mary.

“Whatever. Let’s go get a latte, chai and biscotti. I’m buying.”

Figure 2 Literacy now refers to people who know how to read, write and vector-analyze numbers. (Reading, wRiting, vectoR analysis)

Our Proposal

A global workforce that is literate in the Three Rs of the New Management Equation is an excellent value. It is far less costly than the alternatives. But there is a cost. “New arts destroy the old. See the investment of capital in aqueducts, made useless by hydraulics; fortifications, by gunpowder; sails, by steam; roads and canals, by railways; steam, by electricity,” wrote Ralph Waldo Emerson.6 His observations ring true as we watch vacuum tubes made almost useless by transistors, transistors by silicon chips, typewriters by computer keyboards, telegrams by wireless communication, auscultation by ultrasound, palpation by Magnetic Resonance Imaging, poisonous purple foxglove seed remedies by quality controlled digitalis, spreadsheets by the data matrix software, and the cost accounting variance analysis by the Analysis of Variance.

In 1992, the Chief Executive Officer of Northwest Hospital in Seattle at that time, while grappling with the hand calculations, pencil, ruler, and Xeroxed chart template required to produce a statistical process control chart, criticized cost accounting by quoting Emerson.7 His daring Total Quality Management (TQM) observation is intrepid today: “A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines. With consistency a great soul has nothing to do. He may well concern himself with a shadow on the wall. Speak what you think in hard words and tomorrow speak what to-morrow thinks in hard words again, though it contradict every thing you said to-day—‘Ah, so you are sure to be misunderstood.’—Is it so bad then to be misunderstood? Pythagoras was misunderstood, and Socrates, and Jesus, and Luther, and Copernicus, and Galileo, and Newton and every pure and wise spirit who ever took flesh. To be great is to be misunderstood.”8

Though none of us were able to articulate a proposal for improving the cost accounting variance analysis back then—that being a vector analysis applied to a data matrix—we came to discover that the rest of Emerson’s quote was prophetic. Each age, as Emerson pointed out, must write its own books. The books of an older generation will not fit ours.

Motorola’s Six Sigma business initiative was designed at a time when a dual 5.25-inch floppy disk drive IBM computer with an amber screen was an executive luxury. Harvard Graphics bar charts on a dot matrix printer were breakthrough technology. 9600 Baud was a fast connection. When General Electric got a hold of Six Sigma, the Internet and Windows 95 were new. Sometimes you just get lucky.

Can it be our own college students were babies in the eighties? Great Caesar’s Ghost! We are old men. How can this be? We have grey hair. There are bald spots on the back of our heads. We are wearing progressive lens glasses. Our Les Paul, Stratocaster and PRS guitars and Cry Baby Wah-Wah pedals are antiques for sale on EBay. Time flies. When did this happen? Do we look as funky as a judo gi and as old as a Six Sigma acronym?

Yes. We do. We must change with the times. We must learn, unlearn and relearn how to do things. We must try to age gracefully. We must negotiate and we, all of us, must get to Yes together. We must work, and work very hard, to stay young.9

Endnotes

1 Wood, Gordon S. The Radicalism of the American Revolution. Alfred A. Knopf, New York, 1992. Page 189.

2 http://www.dropbears.com/b/broughsbooks/history/articles/john_adams_quotations.htm

3 Jefferson, Thomas. The Life and Selected Writings of Thomas Jefferson, edited by Adrienne Koch and William Peden. Random House, New York, 1944. Pages 630-633.

4 Paine, Thomas. Collected Writings. Library of America, New York, 1995. Pages 87-97.

5 Wood, Gordon S. The Radicalism of the American Revolution. Page 229.

6 Emerson, Ralph Waldo. Circles, from The Portable Emerson, edited by Carl Bode in collaboration with Malcolm Cowley. Penguin Books, New York.

7 Sloan, M. Daniel and Torpey, Jodi B. Success Stories in Lowering Health Care Costs by Improving Health Care Quality. ASQ Quality Press, Milwaukee, 1992. Page 48.

8 Emerson, Ralph Waldo. Self-Reliance, from The Portable Emerson.

9 Fisher, Roger, and Ury, William. Getting to Yes: Negotiating Agreement Without Giving In. Penguin Books, New York, 1981.

Appendices

I. Glossary of Terms: Data Matrix, Vector Analysis and Evidence-based Decisions

ANOVA – Acronym for Analysis of Variance, Fisher’s general term for the various forms of vector analysis he developed.

Confidence level – Obtained by subtracting the p-value from the number 1, then multiplying by 100.

Cornerstone of Evidence – A generalized tetrahedron representing a vector analysis. The six sides or edges represent the raw data vector and the five possible vector components of variation that can be broken out of any set of raw data. Each of the four faces is a generalized right triangle.

Data matrix – An array of numbers or labels in rows and columns. Each row is an object, entity or event for which we have collected data. Each column is one of the variables we have measured or observed. A column in a data matrix is a vector.

Data vector – A stack of numbers or labels treated as a single entity. It is a point in n-dimensional space, where n is the number of rows in the data matrix.

DMAIC – Acronym for Define, Measure, Analyze, Improve and Control, the Six Sigma project cycle.

F ratio – A measure of the strength of evidence in the data against the null hypothesis. A statistic proportional to the ratio of the squared length of the profit signal vector to the squared length of the noise vector.

Factor – A controlled variable in a designed experiment.

New Management Equation – Our name for the Pythagorean Theorem.

Noise – The chance, random, normal, common statistical variation found everywhere in Nature.

P-value – The probability of getting, by chance alone, an F ratio as large as the one we got. It is a measure of the strength of evidence in the data against the null hypothesis. A p-value less than 0.15 gives a ‘preponderance of evidence’ against the null hypothesis. A p-value less than 0.05 gives ‘clear and convincing’ evidence against the null hypothesis. A p-value less than 0.01 gives evidence ‘beyond a reasonable doubt’ against the null hypothesis.

Profit Signal™ – Quantifies and rank orders which factors impact any business, manufacturing, or service process.

Profit signal vector – Same as profit signal. It is the vector at the bottom right-hand, forward corner of the tetrahedron. The ratio of the length of this vector to the length of the noise vector in a correct analysis yields the F ratio that measures the strength of evidence.

Pythagorean Theorem – The square of the long side of a right triangle is equal to the sum of the squares of the other two sides: c² = a² + b². It is a Law of the Universe.

Tetrahedron – A three-dimensional figure with four triangular faces and six edges.

Vector – An arrow that defines magnitude and direction, connecting one point in space with another.

Vector analysis – The process of breaking up a raw data vector into perpendicular vector components of variation.
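A minimal sketch of the glossary's geometry, with made-up numbers: the data average vector and the variation vector have a zero dot product, so they are perpendicular and their squared lengths satisfy the New Management Equation:

```python
import math

# Made-up raw data vector with four observations.
raw = [12.0, 15.0, 11.0, 14.0]
mean = sum(raw) / len(raw)
average = [mean] * len(raw)            # data average vector
variation = [x - mean for x in raw]    # variation vector

dot = sum(a * v for a, v in zip(average, variation))

def length(v):
    """Euclidean length of a vector."""
    return math.sqrt(sum(x * x for x in v))

print("perpendicular:", math.isclose(dot, 0.0, abs_tol=1e-9))
print(f"{length(raw) ** 2:.1f} = {length(average) ** 2:.1f}"
      f" + {length(variation) ** 2:.1f}")
```

The same check extends to the full cornerstone of evidence: splitting the variation vector further into profit signal and noise components again yields perpendicular pieces whose squared lengths add.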

1781. He provides a complete listing of applied © M. Newton. This classic document addresses life. and modality are dealt with in one difficult. and the Cartesian coordinate system. virtue. 1917. relationships. liberty. Washington. tackles the complexity of quality. and thinking reflect the British personality in applied science. 2003 . Aristotle’s books on Logic. Darwin. the sequential order of perceptions. Einstein introduces the idea that “the evolution of an empirical science is a continuous process of induction. Relativity.” Dr. Eudemian Ethics details the links between a respect for the individual. justice. The Declaration of Independence. challenging text. emphasizes the importance of sequential perceptions. Daniel Sloan and Russell A. Einstein’s ideas on time. measurement. the pursuit of happiness. Act cycle and M. Albert Einstein’s little book. and a good social order. science. Do. Evaluation. knowledge. Nightingale. Study. 5. Hume’s ideas. 1776. and Box are names that can be culturally linked to Hume’s work. 4. Deduction. He specifically describes his use of probability. These texts outline the sequential. and a good social order. the pursuit of happiness. Boyles. Jefferson. Fisher. Einstein specifically honored this work as an inspirational force in his work. Bacon. and mathematics were integral to their lives. writing. All Rights Reserved. Immanuel Kant’s Critique of Pure Reason. The circular logic of knowing. 3. and Test Hypothesis. The study of philosophy. and Politics are essential. 2. the quality of judgments. and other American revolutionaries were Hume’s contemporaries. Daniel Sloan’s IDEA Cycle: Induction. Franklin. Action. Inductive/ Deductive cycle of the scientific method: Hypothesis.Appendices 227 II. Posterior Analytics suggests that the triangle signifies truth. Aristotle’s cycle is the foundation for all science and Walter Shewhart’s original Plan. David Hume’s A Treatise of Human Understanding. They communicated. 
The Business Bookshelf

1. The works of the great science thought leaders — Euclid, Aristotle (Eudemian Ethics), Galileo, Kepler, Descartes, Gauss, Adams, Hume (1739), and Kant — on experiment and analysis are landmarks. The importance of experimental observations must be connected to the "precise, deductive reasoning" of Euclidean geometry and the Pythagorean Theorem. Induction precedes deduction.

2. Ronald A. Fisher's 1915 Biometrika paper, "Frequency Distribution of the Values of the Correlation Coefficient in Samples from an Indefinitely Large Population," introduces the idea of using geometry to represent statistical samples. Fisher wrote it at age 25. The Pythagorean Theorem, or New Management Equation, is a generalization; it applies to samples of any size. (A good resource for finding Fisher's works from 1913-1935 is Collected Papers, Volumes 1-5, J. H. Bennett, Ed., The University of Adelaide, 1971-1974.)

3. Fisher's 1921 Metron paper, "On the Probable Error of a Coefficient of Correlation Deduced from a Small Sample," explains the logarithmic transformation of the correlation coefficient r that leads to a near normal distribution. He presented a table tabulating the transformation for each value of r.

4. Fisher, Statistical Methods for Research Workers, 1925. The thirteenth edition credits W. Edwards Deming for the extension of the z table to the 0.1 level of accuracy.

5. Clarence Irving Lewis, Mind and the World Order: Outline of a Theory of Knowledge, 1929, is a phenomenal, seminal work. This book inspired Walter Shewhart. The philosophy of conceptual pragmatism led to the development of the field of Six Sigma quality improvement.

6. Walter A. Shewhart, Economic Control of Quality of Manufactured Product, 1931. This book includes illustrations and ideas from Fisher's work and Shewhart's own unique perspective on the importance of sequential data analysis.

7. Fisher, The Design of Experiments, 1935. "Inductive inference is the only process known to us by which essentially new knowledge comes into the world." The first chapter addresses the primary importance of the design of an experiment. This book details the practical application of a circular, inductive and deductive logic cycle. Induction precedes deduction.

© M. Daniel Sloan and Russell A. Boyles. All Rights Reserved. 2003.
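The logarithmic transformation of r described in the Fisher 1921 entry above can be sketched in a few lines of Python. This is an illustrative sketch, not code from the book; the function name is mine.

```python
import math

def fisher_z(r):
    """Fisher's logarithmic transformation of the correlation
    coefficient r: z = 0.5 * ln((1 + r) / (1 - r)), i.e. arctanh(r).
    The distribution of z is approximately normal, which is what
    makes the transformation useful for inference about r."""
    return 0.5 * math.log((1 + r) / (1 - r))

# A small table of the transformation for selected values of r,
# in the spirit of the table Fisher published in 1921.
for r in (0.1, 0.5, 0.9):
    print(f"r = {r:.1f}  z = {fisher_z(r):.4f}")
```

Note how the transformation stretches values of r near +/-1, where the raw sampling distribution of r is badly skewed.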

8. Shewhart, Statistical Method from the Viewpoint of Quality Control, 1939, is taken from a series of US Agricultural Department lectures delivered at the invitation of W. Edwards Deming. Pages 44 and 45 contain the graphic illustration of a continuous improvement cycle: Hypothesis (Legislative in nature), Experiment (Executive in character), Test Hypothesis (Judicial). He describes the character of the continuous improvement cycle as legislative, executive, and judicial. Shewhart's related paper, "The Nature and Origin of Standards of Quality," was written in 1935 and was published in the January 1958 issue of The Bell System Technical Journal.

9. W. Edwards Deming, Some Theory of Sampling, 1950. This is a master work of applied science, directly affected by Fisher and Shewhart's work. Deming details the geometry of sample variances on page 62; the pictures R. A. Fisher imagined are drawn. It is curious to note that Deming's 1951 understanding of the importance of a designed experiment and the economy/geometry of sampling comes directly from Some Theory of Sampling. See also Deming's "On a Classification of the Problems of Statistical Inference," The American Statistical Association Journal, Volume 37, Number 218, June 1942, and "On the Distinction between Enumerative and Analytic Surveys," June 1953. The latter article shows the futility of using random samples for analyzing a dynamic process.

10. W. Edwards Deming, Out of the Crisis, 1986, is a noteworthy historical book. Deming cites the influence of Shewhart; his PDCA improvement cycle dates to Aristotle. Here are fourteen points to ponder for a good social order in the workplace, and Deming's vision of a quality controlled health care system.

11. Ludwig von Bertalanffy, General System Theory, 1968. This is a best-of-class book on systems thinking.

12. George Box, William G. Hunter, and J. Stuart Hunter, Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building, 1978.

Many of the important algebraic expressions Fisher wrote are translated into graphs and flow diagrams. Somehow Fisher's ideas are simplified, but Fisher's main point — induction precedes deduction — is absent from this work, and one of the book's essential main points, that the Pythagorean Theorem provides sound theory for all standard statistical theory, is hidden from view.

13. Darrell Huff, How to Lie With Statistics. This is the definitive 20th Century work on the Big Bamboozle.

14. Steve deShazer, Clues: Investigating Solutions in Brief Therapy, 1988. Mr. deShazer's model is the only therapy model and/or psychological theory we know of that was developed using probability theory. deShazer formally opposes a focus on defects, defectives, and problems; rather, one should focus on solutions and doing more of what works. This model for rapid improvement works well in systems of any size.

15. Roger Fisher and William Ury, Getting to Yes: Negotiating Agreement without Giving In. This is the handbook for teaching people how to bring Six Sigma breakthroughs to fruition.

III. Six Sigma Black Belt/Expert 16 Class Curriculum Outline

Black Belt Core Study Texts and Free On-Line Resources

Must Read:
1. Profit Signals: How Evidence-based Decisions Power Six Sigma Breakthroughs, M. Daniel Sloan, Master Black Belt, and Russell A. Boyles, PhD. Sloan Consulting and Westview Analytics, Inc., 2003. Free PDF download, on-line Internet resource.
2. Statistics for Experimenters, Box, Hunter and Hunter (1978) ISBN 0-471-09315-7.

Bedrock Classics:
1. Economic Control of Quality of Manufactured Product, Walter A. Shewhart (1932) ASQ Quality Press, Milwaukee, Wisconsin.
2. How to Lie With Statistics, Darrell Huff (1954) ISBN 0393-31072.
3. Getting to Yes: Negotiating Agreement Without Giving In, Roger Fisher and William Ury (1991) ISBN 0-14-015735-2.
4. Getting Ready to Negotiate: The Getting to Yes Workbook, Roger Fisher and Danny Ertel (1995) ISBN 0-14-023531-0.
5. Learning to See lean value stream mapping workbook, http://www.lean.org/Lean/Bookstore/ProductDetails.cfm?SelectedProductID=9
6. Engineering Statistics Handbook, free on-line Internet resource, http://www.itl.nist.gov/div898/handbook/index.htm

Optional Show Stopper:
Paper Flight, by Jack Botermans (1984) ISBN 0-8050-0500-5. Complete, easy to follow instructions for making 48 different models that fly.

Software Recommendations

Superior software is essential to breakthrough improvements and bottom line business results.

JMP 5.0 — http://www.jmpdiscovery.com/index.html — We believe this vector analysis program is the best in class. It is capable of handling virtually all of the analysis work required in Six Sigma breakthrough projects.

Minitab — http://www.minitab.com/ — We happily accommodate customers who prefer this excellent application.

Microsoft Excel — Six Sigma leaders must know how to use Excel and its add-ins. Another application, Quality America's Excel SPC-IV add-in — http://qualityamerica.com — is also available.

Crystal Ball by Decisioneering — http://decisioneering.com/ — Ease-of-use and a short learning curve make this program desirable for some executive champions. This multi-variate financial simulation tool is superb for enlisting and retaining finance leader support, and it can be an excellent guide for project selection.

Adobe Acrobat 5.0 — Our course is published for students using Adobe Acrobat 5.0. As of September 2003, the Portable Document Format (PDF) is the de facto standard, and it is a requirement for printing, reading, note taking and electronic file attachments.

Our course is distinguished by the speed with which Black Belt candidates produce bottom line business results. The rigor and relevance of the course content are structured around the proven Six Sigma DMAIC cycle: Define, Measure, Analyze, Improve and Control. Course content covers the American Society for Quality's (ASQ) Six Sigma Body of Knowledge and uses Bloom's taxonomy of knowledge:

Knowledge — Black Belt Experts must be able to recognize terminology, definitions, ideas, principles and methods.

Comprehension — Experts must be able to understand tables, diagrams, reports, and directions.
Application — Experts must be able to apply principles, methods, and concepts on the job.
Analysis — Experts must be able to break down data and information.
Synthesis — Experts must expose unseen and informative patterns.
Evaluation — Black Belt Experts must be able to make judgments regarding the value of proposed ideas and solutions.

Black Belt Course Outline

1. Defining Six Sigma: Introduction, Overview, and History — A Six Sigma Gestalt
1.1. Introductions
1.2. Learning Objectives: Theory and practice. Statistical reasoning, analysis, inductive and deductive reasoning, and computing literacy are key. JMP 5.0 (or Minitab 13) software navigation is introduced.
1.3. The 5-Minute PhD: Vector analysis applied to a data matrix
1.4. The Complete Six Sigma Tool Kit: Categorical Catapult Experiment, a 2³ Designed Experiment (DOE); Control Charts, Histograms, Pareto charts, Scatter Diagrams, Correlation, Regression, and the ANOVA
1.5. Four Essentials in a thorough 6 Sigma Analysis
1.5.1. Calculate the Mean
1.5.2. Calculate the Standard Deviation: s and sigma, σ — recognize that the mode and median exist
1.5.5. Calculate Improbability — the F ratio
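The "Four Essentials" computations named in the outline above — the mean, the standard deviation, and the F ratio — can be sketched in plain Python. The data values below are invented for illustration.

```python
def mean(xs):
    return sum(xs) / len(xs)

def stdev(xs):
    """Sample standard deviation s (n - 1 in the denominator)."""
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5

def f_ratio(*groups):
    """One-way ANOVA F ratio: the between-group mean square
    divided by the within-group mean square. A large F says the
    group means differ by more than chance variation explains."""
    all_data = [x for g in groups for x in g]
    grand = mean(all_data)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_data) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

a = [5.1, 4.9, 5.0, 5.2]   # illustrative measurements, setting A
b = [5.6, 5.7, 5.5, 5.8]   # illustrative measurements, setting B
print(mean(a), stdev(a), f_ratio(a, b))
```

The F ratio here is the same Pythagorean decomposition of sums of squares that the course's "vector analysis applied to a data matrix" theme refers to.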

1.5.4. Graph data in meaningful ways that illustrate the mean, standard deviation and probability information
1.6. Standards of Evidence: Evidence-based Profitability Principles
1.6.1. Y = f(X1, X2, …, Xn)
1.6.2. Vector analysis applied to a data matrix
1.6.3. The scientific method: Hypothesis, Experiment, Test Hypothesis
1.6.4. The IDEA cycle: Induction, Deduction, Evaluation and Action
1.6.5. PDSA or PDCA: Plan, Do, Study, Act or Plan, Do, Check, Act
1.6.6. DMAIC: Define, Measure, Analyze, Improve, Control
1.6.7. Analogy (1931-2003): Legal system decisions
1.6.8. Analogy: Management system decisions
1.7. Six Sigma: History, philosophy, goals and models
1.7.1. Where are you today?
1.7.2. Where do you want to be in your future?
1.8. Interactive Dialogue: Assessing evidence in your corporate culture
1.9. The Six Sigma Lucrative Projects Results Map
1.10. Lucrative Project Selection: Selecting and leveraging projects
1.10.1. An Enterprise View: Suppliers, Inputs, Process, Outputs and Customers
1.10.2. Brainstorming
1.10.3. Calculating priority projects using an Excel matrix
1.10.4. S.M.A.R.T. projects: Specific, Measurable, Achievable, Relevant, and Time Bounded
1.11. Project Charters and Planning Tools: Gantt and Performance Evaluation and Review Technique (PERT) charts
1.12. Designed Experiment Homework: Every class participant will complete his or her first breakthrough project this evening. In-class demonstrations are mandatory. Results will be recorded and analyzed using JMP or Minitab for class presentations during class 2.

2. Define: Organizational Responsibilities and Financial Six Sigma — a Profit Signals workshop to include Chief Financial Officers and/or Controllers
2.1. Learning Objectives
2.1.1. What is different about Six Sigma and other problem solving tools?
2.1.2. Six Sigma language: "Kickin' the heck out of variation" led to the martial arts metaphor. Class dialogue on cultural norms and issues related to this topic.
2.2. Homework Experiment Presentations using software: designed experimentation demonstrations from home, typically spread out through the entire day. How to Lie with Statistics reading assignments and discussion. Getting to Yes workbook reports.
2.3. Leadership and Job Descriptions
2.3.1. Executive
2.3.2. Champions
2.3.3. Master Black Belts
2.3.4. Black Belts
2.3.5. Green Belts
2.4. The New Management Equation - Old Equation Comparison: Both yield identical answers. Both are easy to master. They are as reliable as the sunrise and sunset.
2.4.1. Accuracy and precision
2.5. Linking Organizational Goals and Objectives to Six Sigma
2.5.1. SWOT analysis of sub-optimizing systems
2.5.1.1. Strengths
2.5.1.2. Weaknesses
2.5.1.3. Opportunities
2.5.1.4. Threats
2.5.2. Closed and open loop feedback systems
2.6. Tutorials: Futura Apartments tutorial; Vision Research tutorial; Crystal Ball budget building (Decisioneering) tutorial review; hands-on corporate example and demonstration
2.7. The Continuous Catapult Experiment: a 2³ Designed Experiment (DOE). Predicting the future with the Profiler. By the end of the day people have memorized software keystrokes for either Minitab or JMP.

2.8. Project Documentation: Data, spreadsheets, analysis, and story boards
2.8.1. Management reviews
2.8.2. Phased reviews
2.8.3. Executive team presentations: Six Sigma is a business initiative, NOT a quality initiative
2.9. Homework: Begin building a Crystal Ball model related to potential Six Sigma projects. Visualize and plan your breakthrough project presentation. Design, build, and fly paper airplanes according to your experimental array with your team.

3. Defining Six Sigma Project Selection and Benchmarking — Paper Airplane Homework Presentations
3.1. Learning Objectives
3.1.1. A Complete Six Sigma Pilot Project — Synectic Experiment: Paper Flight, Jack Botermans (1984) ISBN 0-8050-0500-5, complete, easy to follow instructions for making 48 different models that fly. Iterative learning and fun. Intuitive and counter-intuitive solutions. Debriefing.
3.2. Negotiation Techniques for Success — Getting to Yes: Data, analysis, and evidence do not speak for themselves. Dialogue discussion.
3.2.1. Build relationships and BATNA
3.2.2. Wise, efficient agreements
3.2.3. 10 Principles for Getting to Yes
3.3. Projects and the SIPOC Diagram (Supplier, Inputs, Process, Outputs, and Customer)
3.3.1. The 5 Whys
3.3.2. Brainstorm and draw a SIPOC diagram
3.4. Profiler: Optimization and Desirability — predicting the future with categorical and continuous variables. A Catapult 2⁵ DMAIC Experiment.
3.4.1. Practical applications and analogies
3.4.2. S.M.A.R.T. project timelines
3.5. Statistical Software Application Practice: The American Society for Quality's Black Belt test is discussed. As of 2003, there were no questions related to vector analysis or the data matrix. Consequently, we cover the entire list of recommended tools.

3.6. Benchmarking — Process Elements and Boundaries
3.6.1. Plant visits and interviews
3.6.2. Independent evaluations and public financial reports
3.6.3. Product tear downs and published books
3.6.4. Literature searches: Internet and company
3.6.5. Internal best practices using the complete Six Sigma tool kit
3.7. Defining — Design for Six Sigma
3.7.1. Voice of the Customer (VOC)
3.7.2. Brainstorming Critical to Quality flight standards
3.7.3. Ground rules for Nominal Group Technique
3.7.4. Operational definitions — Critical to Quality characteristics
3.8. Measurement — Performance Metrics and Documentation
3.8.1. Project charters and paper work: Specific, Measurable, Achievable, Relevant, and Time Bounded
3.8.2. Flow diagramming the production process
3.8.3. Comparing machines, plants, production lines, and shifts
3.9. Improvement: Process characterization and optimization
3.10. Control: DMAIC is what your customers expect to see. Frame your reports accordingly. Textbook DMAIC breakthrough case study presentation.
3.11. Project homework and reading assignments set

4. Defining Process and System Capabilities
4.1. Review of Homework and Reading
4.2. Learning Objectives: Hands-on Define, Measure, and Analyze experiments. Repetition for mastery using M&M candy sampling.
4.2.1. Populations versus samples
4.2.2. Sampling our population of candy; sorting
4.2.3. Analysis: Mean, Standard Deviation, Probability
4.2.4. Graphs and histograms
4.3. Practice with the JMP 5.0 Software Application — The Complete Six Sigma Tool Kit: Vector analysis applied to a data matrix

4.4. Defects Per Unit (DPU)
4.4.1. Calculating Defects per Million Opportunities
4.5. The DMAIC Breakthrough Chart; Juran's Trilogy
4.6. Quality Function Deployment. Homework: Read Quality Function Deployment white papers for report.
4.7. 2³ Designed Experiment: Comparing the value of systematic observation with simple arithmetic counts. Compare M&M enumerative sampling with Motorola's classic two-level, eight-factor DOE analytic sampling. Emphasis of key concepts.
4.7.1. The 2³ Helicopter Designed Experiment
4.7.2. Helicopter 2³ Confirmation Experiment
4.7.3. Calculate and graph Cpk for all 16 copters; calculate and graph Cpk for select individual copters
4.8. Analysis: Mean, Standard Deviation, Probability, and graphed results
4.8.1. Cp and Cpk
4.8.2. Control Charts; Shewhart's P-Chart
4.8.3. Pareto Charts
4.8.4. Scatter Diagrams and Correlation Coefficients
4.8.5. Confidence interval introduction
4.8.6. JMP 5.0 or Minitab calculation practice; practical applications using dice
4.9. Project Selection Focused Homework on Process Capability: project selection updates including the Crystal Ball model

5. Observe Designed Experiments: DMAIC Demonstrations by Students
5.1. Learning Objectives: Vector analysis applied to a data matrix and evidence-based decisions. Understanding the context of multiple variables is the key to breakthrough improvement projects.
5.1.1. Homework reports, presentations, and dialogue. Present results of tools applied in daily work.
5.2. Define: Negotiation, Six Sigma values, and data mining training. Proof reading example.
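The class 4 capability and defect metrics above (Cp, Cpk, defects per million opportunities) reduce to one-line formulas. This is a minimal sketch with invented example numbers, not the course's worksheet.

```python
def cp_cpk(data_mean, data_sd, lsl, usl):
    """Process capability indices for two-sided specification
    limits. Cp compares the spec width to six standard deviations;
    Cpk also penalizes an off-center process mean."""
    cp = (usl - lsl) / (6 * data_sd)
    cpk = min(usl - data_mean, data_mean - lsl) / (3 * data_sd)
    return cp, cpk

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# An off-center process: Cp looks fine, Cpk reveals the problem.
cp, cpk = cp_cpk(data_mean=10.2, data_sd=0.1, lsl=9.7, usl=10.3)
print(cp, cpk)
print(dpmo(defects=12, units=500, opportunities_per_unit=4))
```

A centered process has Cp = Cpk; the gap between them measures how far the mean has drifted toward one specification limit.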

5.3. KANO Model of Quality
5.4. Building a House of Quality — one proven method of encouraging concurrent engineering
5.4.1. The Four Houses of Quality; the Four Phases
5.4.2. The Whats
5.4.3. The Hows
5.4.4. "Correlation matrix" trade-offs
5.4.5. Orthogonal arrays
5.4.6. Using an Excel template
5.5. Design for X (DFX): Design constraints — design for manufacturability, design for test, design for maintainability. Functional requirements and robust design.
5.6. Change Agents and Team Leadership: from Pythagoras and Aristotle, to Frederick Douglass and Harriet Tubman, to 2004. How does this analogy apply to your work?
5.6.1. Change agent methods
5.6.2. Force Field Analysis — forces fighting change
5.6.3. Understanding and overcoming roadblocks
5.6.4. Motivation
5.6.5. Communication
5.6.6. Cultural influences
5.6.7. Diffusion of Innovation: the Innovation Adoption Model and the adoption process
5.7. Negotiation — Getting to YES
5.8. Excel Data Sorting Function — a brief history of data mining
5.8.1. Homogeneous fields and records; columns and rows
5.8.2. 2³ DOE data mining demonstration and practice; iterations and efficient learning
5.8.3. A correct vector analysis — a thorough 6 Sigma analysis: the average, standard deviation, probability, and analytic graph
5.9. Homework: Outline project selections for class presentation. Bring data in a spreadsheet formatted for data (sorting) mining practice.

6. Measuring Value: Rolled Throughput Yield Metrics & Costs of Quality
6.1. Learning Objectives
6.2. Homework presentations and review of data mining strategy
6.3. Drive Down Costs — the Red Bead Sampling Game: interactive role playing using a game of historical significance. Observe the machine.
6.3.1. Uncovering the "Hidden Factory"
6.3.2. Rolled Throughput Yield (RTY)
6.3.3. Poisson computer simulation on Defects per Unit
6.3.4. The 2⁴ Quincunx Machine Experiment for JMP or Minitab practice. Central Limit Theorem simulations using the machine and Decisioneering's computerized demonstration model.
6.4. Costs of Poor Quality
6.4.1. Prevention costs
6.4.2. Appraisal costs
6.4.3. Internal failures
6.4.4. External failures
6.4.5. Detailed walk-through of an exemplary Cost of Quality corporate report. Excel spreadsheet template available.
6.5. Developing and Translating Customer Information
6.5.1. Collecting customer needs and drivers
6.5.2. Surveys: telephone, mailing, interview
6.5.3. Identifying Critical to Quality characteristics (CTQ); Critical to Quality Tree; quantified CTQs, relevant to business in financial, quality, and productivity terms
6.5.4. Universal standards of measurement
6.6. DMAIC: Comprehensive definition process to determine variables and outcomes
6.6.1. Brainstorming
6.6.2. Affinity Diagram experiment
6.6.3. Cause and Effect Diagrams
6.6.4. Thought Process Mapping
6.6.5. Categorical thinking
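The Rolled Throughput Yield and Poisson defects-per-unit ideas in the class 6 list reduce to two short formulas. A minimal sketch with invented step yields; the function names are mine, not the course's.

```python
from math import exp, prod

def rolled_throughput_yield(step_yields):
    """RTY is the product of the first-pass yields of each process
    step — which is why a long process made of individually 'good'
    steps can still produce few defect-free units end to end."""
    return prod(step_yields)

def poisson_yield(dpu):
    """Classic Six Sigma approximation: under a Poisson defect
    model, the probability a unit has zero defects is exp(-DPU)."""
    return exp(-dpu)

print(rolled_throughput_yield([0.95, 0.98, 0.90]))  # three-step process
print(poisson_yield(0.5))                            # DPU = 0.5
```

Note the "hidden factory" effect: each step looks at least 90% good, yet fewer than 84% of units pass all three steps defect-free on the first try.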

6.7. Taguchi Loss Function example; Philip Crosby's Rule of 3
6.8. Quality Cost Statement by product line
6.9. Homework: Focus on Quality Costs project results

7. Measure: Process Mapping
7.1. Introduction: Learning Objectives, homework reports, workshop purpose and agenda
7.2. Process and System Concepts
7.2.1. Systems thinking; systems and processes
7.2.2. Process Model (SIPOC); the process model revisited
7.2.3. Process boundaries, process customers, process categories
7.2.4. What is a parallel process? What is this common process? What makes a process reliable?
7.2.5. Why use a process model? Goals of process design
7.3. Documenting Processes
7.3.1. Why document a process? Why be concerned with information?
7.3.2. Structure your information; balance document needs
7.3.3. Document design rules; a documentation survey tool
7.3.4. "Global" process requirements; a primary objective
7.4. Define the Process
7.4.1. Definitions; outline for process definition; a process definition tool; exercise in process definition
7.4.2. What is your purpose? What is process mapping? Why use process maps (flow diagrams)?
7.4.3. Techniques for process mapping; the mapping method; flow charting the primary process

7.4.4. Adopt and use standard symbols; other useful symbols; the decision question; writing good narrative
7.4.5. Characteristics of a good flow chart; finish the flow chart
7.4.6. Exercise: Primary process
7.4.7. Flow charting alternative paths; Exercise: Alternative paths
7.4.8. Add control points; Controls: some considerations; Exercise: Control points
7.4.9. Exercise: Define responsibilities
7.5. Types of Maps: Using alternate formats for process mapping
7.5.1. Simple flow chart
7.5.2. Top-down flow chart
7.5.3. Cross-functional flow chart
7.5.4. Geography flow diagram
7.5.5. Data flow diagram
7.5.6. Decision tree
7.5.7. PERT chart
7.5.8. Standardized process chart
7.5.9. Responsibility matrix
7.6. Using Maps to Improve and Streamline Processes: goals of process analysis; a process analysis tool; elimination targets: waste, rework, reverse loops, delays, and needless complexity
7.6.1. Technique #1: Value assessment
7.6.2. Technique #2: Standardize
7.6.3. Technique #3: Using the map
7.6.4. Technique #4: Early control
7.6.5. Technique #5: Prevention
7.6.6. Technique #6: Analyze inputs
7.6.7. Examples; Exercise: Remap
7.7. Key implementation points
7.8. Homework

8. Measure: The Productive Team Member
8.1. Workshop purpose and agenda

8.2. Learning objectives
8.3. Homework report (Six Sigma project progress)
8.4. Teams and Team Development
8.4.1. Teams vs. groups
8.4.2. Inputs for a successful team
8.4.3. Outputs of a successful team
8.4.4. What things the team must manage
8.4.5. Characteristics of effective teams
8.4.6. Member roles and responsibilities
8.4.7. Team roles and responsibilities; meeting management and leader skills
8.4.8. Four stages of team development; norms and team development
8.4.9. Signs of team trouble; Groupthink
8.4.10. Overcoming hindrances to team performance; improving team performance
8.4.11. Team self-evaluation
8.4.12. Five approaches to getting unstuck
8.4.13. Exercises: Box of Stuff; Circle in the Square; Murder Mystery
8.4.14. Learning Style Inventory
8.5. Communication
8.5.1. Communication model; communication breakdown; improving communication
8.5.2. Barriers to good listening; how to correct bad listening habits
8.5.3. Types of feedback; practicing feedback; building "I-statements"
8.5.4. The Johari Window
8.6. Cooperation and Change
8.6.1. Competition versus cooperation; a sample "Code of Cooperation"; ground rules for consensus
8.6.2. The Four Room Apartment; change vs. transition
8.6.3. Internal and external forces for change
8.6.4. Sense of urgency — good or bad?
8.6.5. Strategies for managing change; principles of large-system change
8.7. Close

9. Measuring the Process: Failure Mode Effects Analysis (FMEA) Workshop
9.1. Homework review — focus on project financial results
9.2. Definitions, acronyms, and history. Criticality (FMECA) is included and emphasized.
9.3. Walk-through of the entire FMEA process, including the group work tools and methods introduced in the effective team member class
9.4. Real-world examples; review output of product
9.5. Homework: Present project progress and estimated dollar savings using appropriate tools. Visit http://www.fmeca.com/ and read as much as you can prior to the next class.

10. Analyze: Exploring, Summarizing, and Predicting Using Data — Collecting, Recording and Analyzing Measurement Data
10.1. Learning Objectives
10.1.1. Review process capability concepts with a working exercise
10.1.2. Use software to explore databases
10.1.3. Use correct graphics to summarize measurement data
10.1.4. Fit a normal distribution to measurement data and assess goodness of fit
10.1.5. Explain the central limit theorem using coin tosses
10.1.6. Be able to give examples of continuous, count, categorical, pass/fail, ordinal, interval, ratio, and life data
10.2. Review: Types of Data
10.2.1. Traditional taxonomy: nominal, ordinal, interval, ratio
10.2.2. More useful modern taxonomy: continuous = measurement = parameter = variable; attribute = categorical = discrete = nominal; time to failure = life data
10.3. Review of Graphics for Measurement Data
10.3.1. Stem and leaf diagram

10.3.2. Frequency histogram and Cumulative Distribution Function (CDF)
10.3.3. Boxplots
10.4. Review of Descriptive Statistics for Measurement Data
10.4.1. Mean and standard deviation
10.4.2. Minimum, maximum and range
10.4.3. Plus and minus three standard deviations
10.5. JMP Data Exploration Exercises
10.5.1. Entering data in rows and columns
10.5.2. Generating descriptive statistics and graphics
10.5.3. Producing a report: integrating with Microsoft Word, Excel, and PowerPoint
10.6. Yield Calculations
10.6.1. Yield calculations for one-sided specs
10.6.2. Yield calculations for two-sided specs
10.6.3. Cumulative or "rolled throughput" yield (review and reinforcement)
10.7. Workshop: Wooden sticks, calipers, data entry
10.7.1. Central Limit Theorem, coin tosses and process capability
10.7.2. Gauge Reproducibility and Repeatability studies and practice
10.8. Homework focused on project results

11. Analyze: Inductive Reasoning Part I — Quantifying Uncertainty in Measurement Systems (Formerly Known as Hypothesis Testing)
11.1. Homework review focused on project results
11.2. Learning Objectives
11.2.1. Explain relationships between processes and populations
11.2.2. Identify default statistical models for measurement, count and life data
11.2.3. Express real-world problems in terms of statistical models and population parameters
11.2.4. Use confidence intervals to characterize or test a process in terms of mean, standard deviation, three sigma limits, capability indices, fraction defective or reliability
11.3. Measurement Systems
11.3.1. Definition of a measurement system
11.3.2. Population sampling and process sampling
11.3.3. Measurement objectives
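The coin-toss demonstration of the central limit theorem listed above can be simulated in a few lines. This is an illustrative sketch of the classroom exercise, not course code; seed and sample sizes are arbitrary.

```python
import random
from statistics import mean, stdev

random.seed(1)  # fixed seed so the demonstration is repeatable

def coin_toss_means(tosses_per_sample=10, samples=2000):
    """Simulate the coin-toss demonstration of the central limit
    theorem: each toss is 0 or 1, yet the distribution of sample
    means piles up around 0.5 in a bell shape, with standard
    deviation near sqrt(0.25 / tosses_per_sample)."""
    return [mean(random.randint(0, 1) for _ in range(tosses_per_sample))
            for _ in range(samples)]

m = coin_toss_means()
print(round(mean(m), 3), round(stdev(m), 3))
```

With 10 tosses per sample the theory predicts a spread of about 0.158 around 0.5, which a histogram of `m` makes visible.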

11.4. Measurement Uncertainty: Repeatability and Reproducibility (R&R)
11.4.1. Repeatability: dependability of the gauge
11.4.2. Reproducibility: dependability of gauge operators and environment
11.4.3. The role of calibration procedures
11.4.4. Accuracy and precision (review and reinforcement)
11.4.5. Examples and exercises — calibration and calibration control: Penny for Your Thoughts workshop exercise
11.5. Statistical Inference: Confidence and Evidence
11.5.1. Interval estimation
11.5.2. The Normal, Hypergeometric, Poisson, Binomial, Chi-square, t, and Weibull distribution models
11.5.3. Interpreting opinion polls; fair coin tosses; pass/fail data
11.5.4. Sample size calculations
11.5.5. Characterizing and testing exercises
11.5.6. The Seven Habits of Highly Statistical People: Quantifying Uncertainty
11.6. Homework focused on project results

12. Analyze: Inductive Reasoning Part II
12.1. Homework review focused on project results
12.2. Learning Objectives
12.2.1. Recognize statistical problems when they occur, and be able to classify them as testing an objective, comparing processes, or relating variables
12.2.2. Identify appropriate null hypotheses for testing an objective, comparing processes, or relating variables
12.2.3. Choose appropriate test procedures based on type of problem and type of data
12.2.4. Use p-values to interpret the results of statistical tests
12.2.5. Explain the difference between correlation and regression
12.3. Statistical Hypotheses and Process Hypotheses
12.3.1. The null hypothesis
12.3.2. The "one-sided" fallacy
12.3.3. The law of likelihood and the likelihood function
12.3.4. P-values
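The interval-estimation idea in sections 11.5 and 11.2.4 above — characterizing a process mean with a confidence interval — can be sketched directly. The data and the t critical value (2.262 for 9 degrees of freedom at 95%) are illustrative.

```python
from statistics import mean, stdev

def confidence_interval(xs, t_crit):
    """Two-sided confidence interval for a process mean:
    x-bar +/- t * s / sqrt(n). The t critical value must be looked
    up for n - 1 degrees of freedom (e.g. 2.262 for n = 10, 95%)."""
    n = len(xs)
    half_width = t_crit * stdev(xs) / n ** 0.5
    m = mean(xs)
    return m - half_width, m + half_width

data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3, 10.1, 9.9]
lo, hi = confidence_interval(data, t_crit=2.262)
print(round(lo, 3), round(hi, 3))
```

If a target value falls outside the interval, the data are evidence the process mean differs from the target — the confidence-interval form of hypothesis testing the class title refers to.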

12.4. Quantifying Evidence with Test Statistics
12.4.1. Z statistics and the Z transformation
12.4.2. P values from Z statistics
12.4.3. P values from the F statistic
12.4.4. P values from z or Chi-squared distributions
12.4.5. Degrees of freedom
12.4.6. The likelihood ratio
12.4.7. ANOVA: the geometry of analysis
12.5. Homework and Six Sigma project progress review

13. Analyze: Model Building, Data Mining, and Linear Regression
13.1. Quantifying the Strength of Evidence — hypothesis testing revisited
13.1.1. Law of likelihood in comparison problems
13.1.2. Operational interpretation; mathematical definition
13.1.3. Smallest difference of practical significance
13.1.4. Power of detection
13.1.5. Confidence interval for a difference
13.1.6. Sample size calculations
13.1.7. Example: comparing two opinion polls
13.2. Pass-Fail Data
13.2.1. z test for equality of two Binomial proportions (valid only for large sample sizes)
13.2.2. Test for equality of two or more Binomial proportions (valid only for large sample sizes)
13.2.3. Likelihood ratio test for equality of two or more Binomial proportions
13.3. Number of Defects
13.3.1. z test for equality of two Poisson means (valid only for large sample sizes)
13.3.2. Test for equality of two or more Poisson means (valid only for large sample sizes)
13.3.3. Likelihood ratio test for equality of two or more Poisson means
13.4. Continuous Measurements
13.4.1. t test for equality of two Normal means
13.4.2. F test for equality of two Normal standard deviations
13.4.3. F test for equality of two or more Normal means (Analysis of Variance; valid only if all standard deviations are the same)
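The z test for equality of two Binomial proportions (section 13.2.1 above, valid only for large samples) can be sketched with the standard pooled-proportion formula. The counts below are invented, e.g. defectives out of units inspected on two lines.

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """z test for equality of two Binomial proportions using the
    pooled standard error; appropriate only for large sample
    sizes. Returns the z statistic and a two-sided p-value."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    z = (p1 - p2) / sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_z(x1=40, n1=200, x2=25, n2=200)
print(round(z, 3), round(p, 4))
```

Here 20% versus 12.5% defective on 200 units each gives a z near 2, so the p-value falls just under 0.05 — borderline evidence the lines differ.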

13.5. Tests of Association in Contingency Tables
13.5.1. Chi-square tests
13.5.2. Interpreting the table of the Chi-square distribution
13.6. Life Data (Time to Failure)
13.6.1. Likelihood ratio test for equality of two or more Weibull distributions
13.7. Workshop: Pennies for Your Thoughts
13.8. Regression Analysis
13.8.1. Scatter diagrams (review and reinforcement); correlation is not causation
13.8.2. Linear regression models; straight-line regression
13.8.3. Fitting regression models: the least squares estimates; the RMS error
13.8.4. Testing for significance of predictor variables; testing for lack of fit
13.8.5. Residual plots; regression diagnostics; the dangers of R²
13.8.6. Predicted mean values; confidence intervals for predicted mean values; confidence intervals for a predicted individual value
13.8.7. Multiple regression; polynomial regression
13.8.8. "All models are wrong, some are useful."
13.9. JMP exercises
13.10. Homework with project focus

14. Improve: Experimental Design and Analysis
14.1. Homework review focused on projects
14.2. Learning Objectives
14.2.1. Be able to explain the difference between optimization and screening experiments
14.2.2. Create matrices for optimization experiments
14.2.3. Calculate sample sizes for optimization experiments
14.2.4. Analyze data from optimization experiments
14.2.5. Interpret and apply results from optimization experiments
14.3. Introduction to Experimentation
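The least squares estimates for straight-line regression (section 13.8 above) have a closed form worth seeing once in plain code. A minimal sketch with invented data; no statistics library required.

```python
def least_squares_line(xs, ys):
    """Least squares estimates for straight-line regression
    y = b0 + b1 * x: b1 = Sxy / Sxx, b0 = y-bar - b1 * x-bar."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b1 = sxy / sxx
    b0 = my - b1 * mx
    return b0, b1

xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]   # roughly y = 2x with noise
b0, b1 = least_squares_line(xs, ys)
print(round(b0, 3), round(b1, 3))
```

Geometrically this is the same projection-of-a-vector idea as the ANOVA: the fitted line is the projection of the response vector onto the plane spanned by the constant and predictor vectors.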

14. Improve – Experimental Design and Analysis (continued)
    Why should I do experiments?
    When should I do experiments?
    Concepts and Definitions
        Response
        Factor
        Types of factors
            Continuous
            Categorical
        Level
        Design Point
        Design matrix
        Experimental Unit
        Sample Size
        Noise
        Control
        Control group
    DOE Terminology
    Design principles
        Factorial structure
        Randomization
        Replication
        Blocking
        Bold strategy
    Basic Design Process
    Modified Design Process
    Do not experiment with one factor at a time! (OFAT Review and Reinforcement)
    Screening Experiments
    Experiments with All Factors at Two Levels
    Examples
    JMP Steps
    Exercises
    Workshop: The Funnel Process

15. Improve – Process Optimization and Control
    Learning Objectives
        Describe iterative strategy for experimentation
        Create matrices for robust optimization experiments
        Calculate sample sizes for robust optimization experiments
        Perform multiple response analysis
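The "Design matrix" and "Experiments with All Factors at Two Levels" entries above describe a 2^k full factorial: every combination of each factor's low and high level. A minimal sketch follows; the factor names are hypothetical, and real designs would add the randomization and replication the outline lists under design principles.

```python
from itertools import product

def two_level_design(factors):
    """Every run of a 2^k full factorial, with levels coded -1 (low) / +1 (high)."""
    return [dict(zip(factors, levels))
            for levels in product((-1, +1), repeat=len(factors))]

# Hypothetical factors for a process experiment.
runs = two_level_design(["temperature", "pressure", "speed"])
```

Three two-level factors give 2³ = 8 runs, and each coded column is balanced (its levels sum to zero), which is what makes the main effects independently estimable.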

15. Improve – Process Optimization and Control (continued)
        Analyze data from robust optimization experiments
    Review of Designed Experiments Homework
    The Process of Experimentation
        The experimental cycle
        Strategies for experimentation
        Types of experiments
        Design process
    Statistical Modeling
        Standard assumptions
        Models for continuous factors
        Quadratic models for continuous factors
        Models for categorical factors
        Continuous × categorical interactions
        The method of least squares
        Predicted values and residuals
    Statistical Testing
        Testing model coefficients
        Testing for lack of fit
    Multi-level Optimization Experiments
        Response surface analysis
        Sample size calculations
        Multiple response analysis
        Example and JMP exercises
    Exercises
    Workshop: the Funnel Process using robust optimization and quality control
    Homework: Design of Experiments
        Project Focus: Report project results in DMAIC format for final class

16. Control – Optimization Experiments and Statistical Process Control
    Learning Objectives
        Understand common cause and special cause variation
        Describe a reaction plan for out-of-control conditions
        Establishing baselines
        Rational sub-grouping
        Monitoring low failure rates
        Process improvements

16. Control – Optimization Experiments and Statistical Process Control (continued)
    Review of Designed Experiments Homework
    Robust Optimization Experiments
        The concept of robust optimization
            Minimize the variance
            Optimize the mean
            Minimize variability for a given mean
            Seek the best combination of close-to-target mean and low variability
        Identify key noise variables
        Define noise factor
        Include noise factor in the design
        Strategy for design of robust optimization experiments
        Strategy for analysis of robust optimization experiments
        Thought process for designing an experiment
        More on sample size calculations
        Examples
    Multiple Response Optimization
        "Multiple responses" is the rule, not the exception
        Optimizing one response at a time will not work
        The three types of response objective
        Desirability functions
            Constructing a desirability function for each response
            Constructing the overall desirability
            Maximizing the overall desirability
        Optimizing over subsets of the design region
        Apply multiple response technique
        Maximize overall desirability
        Example
        JMP exercises
    Statistical Process Control as a mind set and strategy
        Acceptance sampling and broken promises
        Short-Run SPC
        Multivariate statistical process control
        Hands-on SPC experiments and software practice
    Workshop: the Funnel Process
    Exercises
    Summaries for Quick Reference
    Homework: Design of Experiments Project Focus
        Report project results in DMAIC format
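The desirability-function approach listed above scores each response on a 0-to-1 scale and combines the scores so that no single response can be optimized at the others' expense. A minimal sketch, with hypothetical responses and targets (the book's own analysis would be done in JMP):

```python
from math import prod

def desirability(value, low, high):
    """Larger-is-better desirability: 0 at or below `low`, 1 at or above `high`,
    linear in between."""
    return min(1.0, max(0.0, (value - low) / (high - low)))

def overall(ds):
    """Overall desirability: geometric mean of the individual scores, so a
    zero on any one response vetoes the whole run."""
    return prod(ds) ** (1 / len(ds))

# Hypothetical responses from one experimental run.
d_yield = desirability(82, low=70, high=90)      # yield in percent
d_strength = desirability(55, low=40, high=60)   # strength in MPa
D = overall([d_yield, d_strength])
```

Maximizing D over the design region, rather than maximizing one response at a time, is what the outline means by "Optimizing one response at a time will not work."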

IV. Profit Signals Production Notes

We consciously chose to demonstrate Senior Master Black Belt, Six Sigma level knowledge and skills in every aspect of the production of this book. We began by creating the Profit Signals title on June 18, 2003, and completed the work in PDF format on September 11, 2003. This is the classic Six Sigma project time line. Though it was an entirely chance coincidence, the completion of our book on this day was an appropriate way to celebrate liberty, equality, freedom of speech, democratic values, applied science, art and the pursuit of happiness. Evidence-based decisions are as important to world peace as they are to prosperity.

The Internet, computing power, and software allowed us to complete the entire writing and production of the book in 90 days. Our lean production system included two authors. When necessary, we retained the illustration services of expert contractors. We produced the electronic versions of our book independently. Kinko's prints the four-color cover, perfect-bound paperback version on demand in a pull-production system. We carry only the inventory we need for personal, just-in-time, corporate use.

The applications that played primary roles are as follows:

• Microsoft Word® 2000 and 2002 were our primary composition tools.
• JMP 5.0® statistical software, manufactured by SAS, was our favored analytic program. We also use Minitab with clients who have that standard.
• Microsoft Excel® was used for spreadsheet screen captures and some graphics. Using Excel for data matrix vector analysis shows the amount of work required before a spreadsheet behaves like a reliable, rules-driven, software analysis application.

• Crystal Ball by Decisioneering® was the Excel add-in we used to make this spreadsheet behave like a data matrix.
• Quality America's Excel add-in Statistical Process Control program was used to produce a control chart.
• Microsoft Explorer was the web browser we used for Internet research.
• Process Model®, a data matrix based flow diagramming program, was used to create flow diagrams.
• Microsoft Power Point® was frequently used by Russell for first-draft technical drawings.
• Adobe Photoshop® 7.0 was used for certain photographic and graphic illustrations.
• Adobe InDesign® 2.0.2 allowed us to design, lay out and construct our book.
• Adobe Illustrator® 10 transformed all illustrations into EPS files for production.
• Adobe Acrobat® 6.0 Professional helped us disseminate copies for review.
• Dell desktop and laptop computers, a Hewlett-Packard LaserJet 1300 and an hp officejet v40xi printer produced hard copy for old-fashioned proofreading and review.

In our opinion, it is not only a reasonable expectation for Black Belts, Master Black Belts and Executive Champions to use a similar list of programs in their daily work; it is essential to Six Sigma powered project breakthroughs.

Profound thanks are due to our wives, Lynne and Michelle, and our wonderful children, Austin and Molly. Patience is their virtue. We love them. In addition to the entire Adobe products technical support team, four individuals went well beyond the call of duty as we

produced Profit Signals.

Jack Benham, the President of MedCath, Incorporated, Hospital Division, introduced us in July of 2002. Without Jack's vision, leadership and masterful management skill there would be no Profit Signals.

Cheryl Payseno, our friend, colleague, nurse, and former hospital administrator, volunteered her case study on Breaking the Time Barrier. She also encouraged us to tackle cost accounting variance and break-even thinking head on with the Premise's second illustration, Figure 2.

Our friend, colleague, and final-copy proofreader Bethany Quillinan stepped into the fray to help us see our words through yet another set of eyes. She did a Six Sigma quality job on a pressure-packed deadline.

Finally, our friend Bill Moore volunteered invaluable editorial support. The specificity of his constructive criticisms and the solutions he proposed strengthened the quality of our work immeasurably. Good on ya' matey.

So, "Thank you very, very, very much Lynne, Michelle, Austin, Molly, Jack, Cheryl, Bethany and Bill." Onwards and upwards.

Index

A
Adams, John
Aladdin
analogy
analysis
Analysis of Variance
ANOVA
Archimedes
Aristotle

B
Bamboozle
belt grinding
Bernstein, Peter L.
Black Belt
Box, George E.P.
break-even analysis

C
CABG
Calder, Alexander
Case Studies
Cohen, Bernard
Confidence Level
control chart
cornerstone of evidence

correlation
Corrugated Copters
cost-accounting variance analysis
Cost of Poor Quality
Cpk
credulity
critical thinking
Critical to Quality
CTQ
cube
cynicism

D
Darwin, Charles
data matrix
data matrix geometry
da Vinci, Leonardo
defects per million
degrees of freedom
Delusions
Design of Experiments
differences
Disney, Walt
Disraeli
DMAIC

E
Einstein, Albert
Einthoven, Willem
EKG
emergency department
Emerson, Ralph Waldo
Euclid
evidence-based decision
Executive Committee

F
Fads and Fallacies
Failure Mode Effects Analysis (FMEA)
feedback

Feigenbaum, Armand V.
Feynman, Richard P.
fields
fingerprints
Fisher, Ronald A.
Five-Minute PhD

G
G. Charter Harrison
GAAP
Galileo
Galton, Francis
Galvin, Robert
Gantt, Henry L.
General Electric
Generalization
generalization
Generally Accepted Accounting Principles
George E.P. Box
Gosset, William
Gould, Stephen Jay
Guinness

H
Harrison, G. Charter
Hidden Factory
hidden factory
Hill, Sir Austin Bradford
Huff, Darrell
Hunter, J. Stuart
Hunter, William
hyperspace

I
Imagineering

J
JCAHO
Jefferson, Thomas

JMP
Joint Commission

K
Kaizen-blitz
Keats, John
knowledge

L
law of the universe
lean
Length Of Stay

M
Mackay, Charles
main effect
Marconi
math phobia
Matreshka
Maxwell, James Clerk
measurements
Michelangelo
Minitab
Motorola
multiplication

N
NASA
n dimensions
Netter, Frank
New Management Equation
Newton, Isaac
Normal distribution

O
OFAT

P
P-value
p-value
Paine, Thomas
Paper Bags
Pareto chart
perpendicular planes
PERT
Picasso, Pablo
predicted values
process capability
process maps
Profit Signals
Pythagoras
Pythagorean Theorem

Q
quarterly review

R
reasoning
records
refraction
regression modeling
Ronald Fisher
Rothamsted
Russian dolls

S
Sagan, Carl
sample size
sample standard deviation
scientific management
Sculpey Clay
Shewhart, Walter A.
Simulation
SIPOC
Sisyphus
Six Sigma
Six Sigma theory

Six Sigma tools
Smith, Bill
spreadsheet
spreadsheet analysis
standards of evidence
Stories
straight-line prediction
straw man
strength of evidence

T
Taylor, Frederick W.
tetrahedron
Themis
Three Rs
Transparency
Turrell, James
Twain, Mark

V
variation
vector
vector analysis
