
Predictable

Quality is related to processes. A process is “a series of actions or steps taken in order to achieve a particular end.” It doesn’t matter whether the process is the handling of invoices, the serving of customers in a bank, the manufacture or assembly of parts, the processing of insurance claims, the passage of the sick through a hospital, or any one of thousands of others. A process involves movement and action in a sequential fashion.

Every quality professional is concerned with the improvement of processes. By making processes better, we get less waste, lower costs and happier customers.

The image above depicts two opposed states … a dynamic, changing state and a static state. The lake is static, unchanging. We might take temperature measurements in the lake at different depths and come back tomorrow to find no difference. We might take samples of the lake water to compare, at a later date, with water from other lakes.

By contrast, the stream is dynamic. It changes from second to second. It is a combination of myriad chaotic processes that would take myriad Navier–Stokes equations to solve … that is, once the Millennium Prize has been won by showing how to solve the Navier–Stokes equations at all. Measure the flow rate in different parts of the stream, and you would not be surprised to find constant change.

The stream represents the changing, dynamic businesses with which we are all familiar and which we are all concerned with improving. Professor Deming referred to the methods used to study dynamic systems as “analytic”. The lake, he said, is studied using “enumerative” methods.

Researchers into lake composition, psychologists and demographers all use enumerative methods. The first use of such methods was by John Arbuthnot, who published the first statistical test in 1710. Pierre-Simon, marquis de Laplace, pioneer of the Laplace transform, issued his Théorie analytique des probabilités in 1812, laying down many of the fundamentals of statistics. Such statistics are based on the normal distribution, derived by Gauss in 1809. Over the ensuing centuries, statistical tests, or hypothesis tests, were devised by men such as Wilson, Box, Cox, Mood, Mann, Whitney, Kruskal, Wallis and Friedman. There is a wealth of wonderful and interesting statistics here. It is useless for process improvement.

Whilst enumerative methods proved powerful, they were completely unsuitable for dynamic, analytic situations. In 1944 Dr Shewhart made the brilliant observation that:

“Classical statistics start with the assumption that a statistical universe exists, whereas [SPC] starts with the assumption that a statistical universe does not exist.”

Dr Shewhart’s discovery was that classical, enumerative statistics were inappropriate for process improvement.

This led Professor Deming to state in 1986:

"The student should avoid passages in books that treat confidence intervals and tests of significance, as such calculations have no application in analytic problems in science and industry." "Analysis of variance, t-test, confidence intervals, and other statistical techniques taught in the books, however interesting, are inappropriate because they bury the information contained in the order of production." "... a confidence interval has no operational meaning for prediction, hence provides no degree of belief in planning."

In the 30 years since Professor Deming’s statements, the message still hasn’t sunk in for most folk. Six Sigma had farcical foundations, based on the claim, derived from the height of a stack of disks (“Benderizing”), that all processes drift ±1.5 sigma in 24 hours. What followed was worse. Six Sigma was concocted by a psychologist who said “I am not an engineer. I had no idea what Mr Bill Smith was talking about [Smith’s casting process]”. He filled Six Sigma with what he knew: enumerative methods. Consequently, 90% of typical Six Sigma course content today has absolutely nothing to do with process improvement! Six Sigma focuses on exactly what Professor Deming warned against: static, enumerative methods. It is hardly surprising that Dr Wheeler, the world’s leading process statistician, calls Six Sigma “goofy”, or that CBS calls it the most stupid fad of all time. “Of the 58 large companies that announced Six Sigma programs, 91 percent have trailed the S&P 500 since.” A survey by Minitab showed that 80% of Six Sigma projects fail (of those brave enough to admit failure).

Professor Deming’s key word for process improvement is prediction. Businesses want to be sure not only that their processes will be improved, but that they will stay that way into the future. Dr Shewhart derived his control charts with statistical knowledge, but based them on economics. His charts indicate when a process is predictable, and when special causes that disrupt stability are likely to exist and should be investigated. Most importantly, a control chart is not a probability chart. It does not give probabilities of a process being predictable or otherwise. It does not depend on any particular data distribution in the way that enumerative methods do.

Have a look at the data distributions below. It is difficult to imagine just what processes might produce them, but which of the processes do you think could be charted with a control chart, without any data manipulation of any kind? That is, which can be control charted without pressing a button on ridiculously expensive statistical software to torture the data to make it confess?
Reproduced with permission of Dr Wheeler.

The answer is all of them. It doesn’t matter what the histogram or the distribution looks like. Data from any distribution can be charted. Furthermore, simple XbarR charts, manually drawn, produce results just as good as those from folk selling software to draw XbarS charts.
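As a sketch of just how little machinery an XbarR chart needs, here are the centre lines and control limits computed in a few lines of Python. The chart constants A2, D3 and D4 are the standard tabulated values for subgroups of five; the measurements themselves are illustrative numbers, not data from any real process.

```python
# X-bar and R chart limits for subgroups of size n = 5.
# A2, D3, D4 are the standard chart constants for n = 5 (from any SPC table).
A2, D3, D4 = 0.577, 0.0, 2.114

# Illustrative subgroup measurements (invented for the example).
subgroups = [
    [5, 6, 7, 8, 9],
    [4, 6, 8, 6, 6],
    [7, 7, 7, 7, 7],
    [6, 8, 6, 8, 7],
    [5, 5, 6, 7, 7],
]

means = [sum(s) / len(s) for s in subgroups]    # subgroup averages
ranges = [max(s) - min(s) for s in subgroups]   # subgroup ranges

xbarbar = sum(means) / len(means)   # grand average: centre of the X-bar chart
rbar = sum(ranges) / len(ranges)    # average range: centre of the R chart

# The usual limit formulas:
#   UCL_x = Xbarbar + A2*Rbar,  LCL_x = Xbarbar - A2*Rbar
#   UCL_R = D4*Rbar,            LCL_R = D3*Rbar
ucl_x, lcl_x = xbarbar + A2 * rbar, xbarbar - A2 * rbar
ucl_r, lcl_r = D4 * rbar, D3 * rbar

print(f"X-bar chart: centre {xbarbar:.2f}, limits {lcl_x:.2f} .. {ucl_x:.2f}")
print(f"R chart:     centre {rbar:.2f}, limits {lcl_r:.2f} .. {ucl_r:.2f}")
```

Plot the subgroup means against the X-bar limits and the subgroup ranges against the R limits, and the chart is done; nothing here is beyond pencil and paper.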

We have a system that can demonstrate the power of analytic methods and control charts to students. Our spectacular new Q-Skills3D is the world’s first fully interactive 3D training product in quality. It has been built using what is known as a game engine. This engine simulates real-world physics and behaviours. We can investigate, and wonder at, dynamic systems from our armchairs.

One such dynamic simulation is a real-world, historical story of process improvement, turned into a game. This game is used in both histogram and control chart training, as well as in explaining the meaning of “World Class Quality” … which has nothing to do with the number of hits shown in the image below.

The Q-Skills3D ship game simulates the situation over a century ago, on a heaving sea, with a yawing, pitching, rolling vessel shooting at another. As you can imagine, the hit rate was terrible. Fortunately, Q-Skills3D can be used both as self-paced training and in the classroom. For the benefit of the latter, data is generated from playing the shooting game. The data for a very experienced gunner (guess who) is plotted on the left below. I might have drawn it Shewhart style, but I was lazy and used a spreadsheet. As you might expect, the results are excellent and in control. We can depend on the gunner to produce consistent shooting in the future.

Now suppose we take the data, swap three pairs of values, and draw a control chart. It uses exactly the same data, but in a different sequence. We get the result on the right, below. You might imagine this result as coming from a second gunner. He might appear to be a better gunner, but his control chart shows an out-of-control point. We cannot depend on his shooting. He is unpredictable.

Remember that the data for these two gunners is identical … except for the order. No enumerative test could distinguish between the two performances. It is also important to look at the histogram for the data. The histograms are also identical, and the data is clearly non-symmetrical. It is not normally distributed data. Enumerative methods that assume normality cannot be used on such data, but skewed data like this is fine for control charting.

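The order-dependence can be reproduced with a toy XmR (individuals and moving range) chart. The two sequences below are invented for illustration, not the ship game's data: they contain exactly the same ten values, so every enumerative summary of them, including the histogram, is identical, yet their control limits differ because the moving ranges depend on the order.

```python
# XmR (individuals) chart: limits = mean ± 2.66 × average moving range.
def xmr_limits(data):
    """Return (lcl, centre, ucl) for an individuals chart."""
    centre = sum(data) / len(data)
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar

def out_of_control(data):
    """Values falling outside the limits computed from this ordering."""
    lcl, _, ucl = xmr_limits(data)
    return [x for x in data if x < lcl or x > ucl]

# Two orderings of the SAME ten values (illustrative numbers only).
alternating = [2, 8, 2, 8, 2, 8, 2, 8, 2, 8]   # steady back-and-forth
shifted     = [2, 2, 2, 2, 2, 8, 8, 8, 8, 8]   # looks like a process shift

# Identical multisets, so identical histograms and enumerative summaries.
assert sorted(alternating) == sorted(shifted)

print(out_of_control(alternating))  # wide limits: no signal
print(out_of_control(shifted))      # narrow limits: the shift signals
```

The alternating sequence produces large moving ranges, hence wide limits and no signal; the shifted sequence produces tiny moving ranges, hence tight limits, and every point signals. Same histogram, opposite verdicts: the information was in the order.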
It is also important to note that you should never draw a normal distribution over a histogram. Not only does it provide no benefit and carry no meaning, it can hide the voice of the process speaking in the histogram. In this case, we can see that the gunner seems more likely to overshoot than undershoot. We might decide to investigate the causes.

While we can eliminate special causes, such as wild shooting, a process improvement requires a system change. This is what a very clever man did 120 years ago. Q-Skills3D gives an interactive 3D simulation of the brilliant process improvement that he devised. It achieved dramatically better quality. The student plays the game again with the process improvement, and the control limits are adjusted. However, in a fascinating but sad tale, told in the Q-Skills3D training, it took decades for the new method to be accepted.

It seems incredible that it took so long for such a simple yet brilliant idea to be adopted in the Navy. Yet how much more incredible is it that Dr Shewhart’s and Professor Deming’s simple yet brilliant control charts are still so poorly understood, more than 30 years on? Why has it taken so long for people to understand that processes need analytic methods, not enumerative ones?
