# 7 Basic Quality Tools & Root Cause Analysis


Contents

Articles
• Seven Basic Tools of Quality
• Ishikawa diagram
• Check sheet
• Control chart
• Histogram
• Pareto chart
• Scatter plot
• Stratified sampling
• Root cause analysis
• 5 Whys
• Why–because analysis
• Eight Disciplines Problem Solving

References
• Article Sources and Contributors
• Image Sources, Licenses and Contributors

Seven Basic Tools of Quality
The Seven Basic Tools of Quality is a designation given to a fixed set of graphical techniques identified as being most helpful in troubleshooting issues related to quality.[1] They are called basic because they are suitable for people with little formal training in statistics and because they can be used to solve the vast majority of quality-related issues.[2]:198

The tools are:[3]
• The cause-and-effect or Ishikawa diagram
• The check sheet
• The control chart
• The histogram
• The Pareto chart
• The scatter diagram
• Stratification (alternately, the flow chart or run chart)

The designation arose in postwar Japan, inspired by the seven famous weapons of Benkei.[4] At that time, companies that had set about training their workforces in statistical quality control found that the complexity of the subject intimidated the vast majority of their workers, and scaled back training to focus primarily on simpler methods which suffice for most quality-related issues anyway.[2]:18 The Seven Basic Tools stand in contrast with more advanced statistical methods such as survey sampling, acceptance sampling, statistical hypothesis testing, design of experiments, multivariate analysis, and various methods developed in the field of operations research.[2]:199

References
[1] Montgomery, Douglas (2005). Introduction to Statistical Quality Control (http://www.eas.asu.edu/~masmlab/montgomery/). Hoboken, New Jersey: John Wiley & Sons, Inc. p. 148. ISBN 9780471656319. OCLC 56729567.
[2] Ishikawa, Kaoru (1985). What Is Total Quality Control? The Japanese Way (1st ed.). Englewood Cliffs, New Jersey: Prentice-Hall. ISBN 9780139524332. OCLC 11467749.
[3] Tague, Nancy R. (2004). "Seven Basic Quality Tools" (http://www.asq.org/learn-about-quality/seven-basic-quality-tools/overview/overview.html). The Quality Toolbox. Milwaukee, Wisconsin: American Society for Quality. p. 15. Retrieved 2010-02-05.
[4] Ishikawa, Kaoru (1990). Introduction to Quality Control (1st ed.). Tokyo: 3A Corp. p. 98. ISBN 9784906224616. OCLC 23372992.

Ishikawa diagram

One of the Seven Basic Tools of Quality
First described by: Kaoru Ishikawa
Purpose: To break down (in successive layers of detail) root causes that potentially contribute to a particular effect

Ishikawa diagrams (also called fishbone diagrams, herringbone diagrams, cause-and-effect diagrams, or Fishikawa) are causal diagrams, created by Kaoru Ishikawa (1990), that show the causes of a certain event.[1] Common uses of the Ishikawa diagram are product design and quality defect prevention, to identify potential factors causing an overall effect. Each cause or reason for imperfection is a source of variation. Causes are usually grouped into major categories to identify these sources of variation. The categories typically include:
• People: Anyone involved with the process
• Methods: How the process is performed and the specific requirements for doing it, such as policies, procedures, rules, regulations and laws
• Machines: Any equipment, computers, tools, etc. required to accomplish the job
• Materials: Raw materials, parts, pens, paper, etc. used to produce the final product
• Measurements: Data generated from the process that are used to evaluate its quality
• Environment: The conditions, such as location, time, temperature, and culture, in which the process operates

Overview
Ishikawa diagrams were proposed by Kaoru Ishikawa[2] in the 1960s, who pioneered quality management processes in the Kawasaki shipyards, and in the process became one of the founding fathers of modern management. The technique was first used in the 1940s, and is considered one of the seven basic tools of quality control.[3] It is known as a fishbone diagram because of its shape, similar to the side view of a fish skeleton.

[Figure: Ishikawa diagram, in fishbone shape, showing factors of Equipment, Process, People, Materials, Environment and Management, all affecting the overall problem. Smaller arrows connect the sub-causes to major causes.]

Mazda Motors famously used an Ishikawa diagram in the development of the Miata sports car, where the required result was "Jinba Ittai" or "Horse and Rider as One". The main causes included such aspects as "touch" and "braking", with the lesser causes including highly granular factors such as "50/50 weight distribution" and "able to rest elbow on top of driver's door". Every factor identified in the diagram was included in the final design.

Causes

Causes in the diagram are often categorized, such as to the 8 Ms, described below. Cause-and-effect diagrams can reveal key relationships among various variables, and the possible causes provide additional insight into process behavior. Causes can be derived from brainstorming sessions. These groups can then be labeled as categories of the fishbone. They will typically be one of the traditional categories mentioned above but may be something unique to the application in a specific case. Causes can be traced back to root causes with the 5 Whys technique, described below.

Typical categories are:

The 8 Ms (used in manufacturing)
• Machine (technology)
• Method (process)
• Material (includes raw material, consumables and information)
• Man Power (physical work)/Mind Power (brain work): Kaizens, suggestions
• Measurement (inspection)
• Milieu/Mother Nature (environment)
• Management/Money Power
• Maintenance

The 8 Ps (used in service industry)
• Product=Service
• Price
• Place
• Promotion/Entertainment
• People (key person)
• Process
• Physical Evidence
• Productivity & Quality

The 4 Ss (used in service industry)
• Surroundings
• Suppliers
• Systems
• Skills

Questions to be asked while building a fishbone diagram (Man/Operator):
– Was the document properly interpreted?
– Was the information properly circulated to all the functions?
– Did the recipient understand the information?
– Was the proper training to perform the task administered to the person?
– Was too much judgment required to perform the task?
– Were guidelines for judgment available?
– Did the environment influence the actions of the individual?
– Are there distractions in the workplace?
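The layered cause structure described above is naturally a tree: the effect at the root, major categories as branches, sub-causes as leaves. A minimal sketch of that structure follows; the effect, the listed causes, and the `print_fishbone` helper are invented for illustration, not taken from the source:

```python
# A fishbone (Ishikawa) diagram as a tree: effect at the root,
# major cause categories as branches, sub-causes as leaves.
fishbone = {
    "effect": "Late deliveries",
    "causes": {
        "People": ["Insufficient training", "High turnover"],
        "Methods": ["Unclear routing procedure"],
        "Machines": ["Aging delivery vans"],
        "Materials": ["Packaging shortages"],
        "Measurements": ["No on-time-delivery metric"],
        "Environment": ["Seasonal traffic congestion"],
    },
}

def print_fishbone(diagram):
    """Render the diagram as an indented outline (a text stand-in for the graphic)."""
    lines = [f"Effect: {diagram['effect']}"]
    for category, subcauses in diagram["causes"].items():
        lines.append(f"  {category}")
        for cause in subcauses:
            lines.append(f"    - {cause}")
    return "\n".join(lines)

print(print_fishbone(fishbone))
```

Each leaf can be expanded into deeper layers (e.g. via the 5 Whys) by replacing a string with another nested mapping.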

Criticism

In a discussion of the nature of a cause it is customary to distinguish between necessary and sufficient conditions for the occurrence of an event. A necessary condition for the occurrence of a specified event is a circumstance in whose absence the event cannot occur. A sufficient condition for the occurrence of an event is a circumstance in whose presence the event must occur.[4] A sufficient condition naturally contains one or several necessary ones. Ishikawa diagrams are meant to use the necessary conditions and split the "sufficient" ones into their "necessary" parts. Some critics, missing this simple logic, have asked which conditions (necessary or sufficient) are addressed by the diagram in a given case.[5]

References
[1] Ishikawa, Kaoru (1990). Introduction to Quality Control (Translator: J. H. Loftus). 448 p. ISBN 4-906224-61-X. OCLC 61341428.
[2] Hankins, Judy (2001). Infusion Therapy in Clinical Practice. p. 42.
[3] Tague, Nancy R. (2004). "Seven Basic Quality Tools" (http://www.asq.org/learn-about-quality/seven-basic-quality-tools/overview/overview.html). The Quality Toolbox. Milwaukee, Wisconsin: American Society for Quality. p. 15. Retrieved 2010-02-05.
[4] Copi, Irving M. (1968). Introduction to Logic. New York: Macmillan. p. 322.
[5] Gregory, Frank Hutson (1992). Cause, Effect, Efficiency & Soft Systems Models. Warwick Business School Research Paper No. 42 (ISSN 0265-5976); later published in Journal of the Operational Research Society, vol. 44 (4), pp. 333-344.

Further reading
• Ishikawa, Kaoru (1990). Introduction to Quality Control (Translator: J. H. Loftus). 448 p. ISBN 4-906224-61-X. OCLC 61341428.
• Dale, Barrie G.; et al. (2007). Managing Quality (5th ed.). ISBN 978-1-4051-4279-3. OCLC 288977828.

External links
• Article from HCI Australia on Cause and Effect Diagrams (http://www.hci.com.au/hcisite5/library/materials/Cause and effect diagrams.htm)

Check sheet

One of the Seven Basic Tools of Quality
Purpose: To provide a structured way to collect quality-related data as a rough means for assessing a process or as an input to other analyses

The check sheet is a simple document that is used for collecting data in real-time and at the location where the data is generated. The document is typically a blank form that is designed for the quick, easy, and efficient recording of the desired information, which can be either quantitative or qualitative. When the information is quantitative, the check sheet is sometimes called a tally sheet.[1]

A defining characteristic of a check sheet is that data is recorded by making marks ("checks") on it. A typical check sheet is divided into regions, and marks made in different regions have different significance. Data is read by observing the location and number of marks on the sheet. The check sheet is one of the seven basic tools of quality control.[2]

The 5 basic types of check sheets are:
• Classification: A trait such as a defect or failure mode must be classified into a category.
• Location: The physical location of a trait is indicated on a picture of a part or item being evaluated.
• Frequency: The presence or absence of a trait or combination of traits is indicated. The number of occurrences of a trait on a part can also be indicated.
• Measurement Scale: A measurement scale is divided into intervals, and measurements are indicated by checking an appropriate interval.
• Check List: The items to be performed for a task are listed so that, as each is accomplished, it can be indicated as having been completed.

[Figure: An example of a simple quality control check sheet]

References
[1] Schultz, John R. (2006). "Measuring Service Industry Performance: Some Basic Concepts" (http://onlinelibrary.wiley.com/doi/10.1002/pfi.4930450405/abstract). International Society for Performance Improvement. Retrieved 2011-10-06.
[2] Tague, Nancy R. (2004). "Seven Basic Quality Tools" (http://www.asq.org/learn-about-quality/seven-basic-quality-tools/overview/overview.html). The Quality Toolbox. Milwaukee, Wisconsin: American Society for Quality. p. 15. Retrieved 2010-02-05.
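A classification-type check sheet reduces naturally to a tally of marks per category. A short sketch of that idea, with invented defect categories:

```python
from collections import Counter

# Each observed defect gets one "check" in its category,
# exactly as marks accumulate on a paper classification check sheet.
observations = ["scratch", "dent", "scratch", "misalignment", "scratch", "dent"]
tally = Counter(observations)

# Print a crude tally-sheet view: one stroke per check, plus the count.
for category, checks in sorted(tally.items()):
    print(f"{category:15s} {'|' * checks}  ({checks})")
```

Reading the sheet is then just reading the counts; the same `Counter` could feed a Pareto chart by sorting categories by frequency.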

Control chart

One of the Seven Basic Tools of Quality

History

Shewhart created the basis for the control chart and the concept of a state of statistical control by carefully designed experiments. While Dr. Shewhart drew from pure mathematical statistical theories, he understood that data from physical processes typically produce a "normal distribution curve" (a Gaussian distribution, also commonly referred to as a "bell curve"). He discovered that observed variation in manufacturing data did not always behave the same way as data in nature (Brownian motion of particles). Dr. Shewhart concluded that while every process displays variation, some processes display controlled variation that is natural to the process, while others display uncontrolled variation that is not present in the process causal system at all times.[5]

[Figure: schematic control chart]

That diagram, and the short text which preceded and followed it, "set forth all of the essential principles and considerations which are involved in what we know today as process quality control."[4] Shewhart stressed that bringing a production process into a state of statistical control, where there is only common-cause variation, and keeping it in control, is necessary to predict future output and to manage a process economically.

In 1924 or 1925, Shewhart's innovation came to the attention of W. Edwards Deming, then working at the Hawthorne facility. Deming later worked at the United States Department of Agriculture and then became the mathematical advisor to the United States Census Bureau. Over the next half a century, Deming became the foremost champion and proponent of Shewhart's work. After the defeat of Japan at the close of World War II, Deming served as statistical consultant to the Supreme Commander of the Allied Powers. His ensuing involvement in Japanese life, and long career as an industrial consultant there, spread Shewhart's thinking, and the use of the control chart, widely in Japanese manufacturing industry throughout the 1950s and 1960s.

Chart details

A control chart consists of:
• Points representing a statistic (e.g., a mean, range, proportion) of measurements of a quality characteristic in samples taken from the process at different times [the data]
• The mean of this statistic using all the samples (e.g., the mean of the means, the mean of the ranges, the mean of the proportions)
• A center line drawn at the value of the mean of the statistic
• The standard error of the statistic (e.g., standard deviation/sqrt(n) for the mean), also calculated using all the samples
• Upper and lower control limits (sometimes called "natural process limits"), which indicate the threshold at which the process output is considered statistically 'unlikely', drawn typically at 3 standard errors from the center line

The chart may have other optional features, including:
• Upper and lower warning limits, drawn as separate lines, typically two standard errors above and below the center line
• Division into zones, with the addition of rules governing frequencies of observations in each zone
• Annotation with events of interest, as determined by the Quality Engineer in charge of the process's quality
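The construction listed above — plot the sample statistic, draw the center line at its mean, and place control limits three standard errors out — can be sketched for an x̄ chart. The sample data here are invented, and the pooled-variance estimate of the standard error is one common choice among several:

```python
import math

# Samples of n measurements each, taken from the process at different times.
samples = [
    [10.2, 9.8, 10.1, 10.0],
    [10.4, 10.1, 9.9, 10.2],
    [9.7, 10.0, 10.3, 9.9],
    [10.1, 10.2, 9.8, 10.0],
]

means = [sum(s) / len(s) for s in samples]   # the plotted statistic
center_line = sum(means) / len(means)        # mean of the sample means

# Pooled within-sample standard deviation, then the standard error of the mean.
n = len(samples[0])
variances = [sum((x - sum(s) / n) ** 2 for x in s) / (n - 1) for s in samples]
sigma = math.sqrt(sum(variances) / len(variances))
std_err = sigma / math.sqrt(n)

ucl = center_line + 3 * std_err   # upper control limit
lcl = center_line - 3 * std_err   # lower control limit
print(f"CL={center_line:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}")
```

Plotting `means` against `ucl`, `center_line`, and `lcl` over time gives the chart itself; warning limits would use `2 * std_err` in the same way.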

Chart usage

If the process is in control (and the process statistic is normal), 99.7300% of all the points will fall between the control limits. Any observations outside the limits, or systematic patterns within them, suggest the introduction of a new (and likely unanticipated) source of variation, known as a special-cause variation. Since increased variation means increased quality costs, a control chart "signaling" the presence of a special cause requires immediate investigation.

This makes the control limits very important decision aids. The control limits tell you about process behavior and have no intrinsic relationship to any specification targets or engineering tolerance. In practice, the process mean (and hence the center line) may not coincide with the specified value (or target) of the quality characteristic, because the process's design simply cannot deliver the process characteristic at the desired level.

Control charts deliberately omit specification limits or targets because of the tendency of those involved with the process (e.g., machine operators) to focus on performing to specification, when in fact the least-cost course of action is to keep process variation as low as possible. Attempting to make a process whose natural center is not the same as the target perform to target specification increases process variability and increases costs significantly, and is the cause of much inefficiency in operations. Process capability studies do examine the relationship between the natural process limits (the control limits) and specifications.

The purpose of control charts is to allow simple detection of events that are indicative of actual process change. This simple decision can be difficult where the process characteristic is continuously varying; the control chart provides statistically objective criteria of change. When change is detected and considered good, its cause should be identified and possibly become the new way of working; where the change is bad, its cause should be identified and eliminated.

The purpose in adding warning limits or subdividing the control chart into zones is to provide early notification if something is amiss. Instead of immediately launching a process improvement effort to determine whether special causes are present, the Quality Engineer may temporarily increase the rate at which samples are taken from the process output until it is clear that the process is truly in control. Note that with three-sigma limits, common-cause variations result in signals less than once out of every twenty-two points for skewed processes and about once out of every three hundred seventy (1/370.4) points for normally-distributed processes.[6] The two-sigma warning levels will be reached about once for every twenty-two (1/21.98) plotted points in normally-distributed data. (For example, the means of sufficiently large samples drawn from practically any underlying distribution whose variance exists are normally distributed, according to the Central Limit Theorem.)
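The 1/370.4 and 1/21.98 figures quoted above follow directly from the normal tail probabilities beyond ±3σ and ±2σ; a quick numerical check using only the standard library:

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Probability that a point from an in-control normal process
# falls outside the 3-sigma control limits (both tails).
p_outside = 2.0 * (1.0 - normal_cdf(3.0))
print(f"P(outside 3-sigma) = {p_outside:.4%}")        # about 0.27%
print(f"false alarms: 1 in {1.0 / p_outside:.1f}")    # about 1 in 370.4

# The 2-sigma warning limits are crossed far more often.
p_warning = 2.0 * (1.0 - normal_cdf(2.0))
print(f"warnings: 1 in {1.0 / p_warning:.2f}")        # about 1 in 21.98
```

This also makes concrete why warning limits are only warnings: an in-control process trips them roughly seventeen times as often as it trips the control limits.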

Choice of limits

Shewhart set 3-sigma (3-standard error) limits on the following basis:
• The coarse result of Chebyshev's inequality that, for any probability distribution, the probability of an outcome greater than k standard deviations from the mean is at most 1/k².
• The finer result of the Vysochanskii–Petunin inequality that, for any unimodal probability distribution, the probability of an outcome greater than k standard deviations from the mean is at most 4/(9k²).
• The empirical investigation of sundry probability distributions, which reveals that at least 99% of observations occur within three standard deviations of the mean.

Shewhart summarized the conclusions by saying:

... the fact that the criterion which we happen to use has a fine ancestry in highbrow statistical theorems does not justify its use. Such justification must come from empirical evidence that it works. As the practical engineer might say, the proof of the pudding is in the eating.

Though he initially experimented with limits based on probability distributions, Shewhart ultimately wrote:

Some of the earliest attempts to characterize a state of statistical control were inspired by the belief that there existed a special form of frequency function f and it was early argued that the normal law characterized such a state. When the normal law was found to be inadequate, then generalized functional forms were tried. Today, however, all hopes of finding a unique functional form f are blasted.

The control chart is intended as a heuristic. Deming insisted that it is not a hypothesis test and is not motivated by the Neyman–Pearson lemma. He contended that the disjoint nature of population and sampling frame in most industrial situations compromised the use of conventional statistical techniques. Deming's intention was to seek insights into the cause system of a process ... under a wide range of unknowable circumstances, future and past. He claimed that, under such conditions, 3-sigma limits provided ... a rational and economic guide to minimum economic loss ... from the two errors:
1. Ascribe a variation or a mistake to a special cause (assignable cause) when in fact the cause belongs to the system (common cause). (Also known as a Type I error)
2. Ascribe a variation or a mistake to the system (common causes) when in fact the cause was a special cause (assignable cause). (Also known as a Type II error)

Calculation of standard deviation

As for the calculation of control limits, the standard deviation (error) required is that of the common-cause variation in the process. Hence, the usual estimator, in terms of sample variance, is not used, as this estimates the total squared-error loss from both common- and special-causes of variation. An alternative method is to use the relationship between the range of a sample and its standard deviation derived by Leonard H. C. Tippett, an estimator which tends to be less influenced by the extreme observations which typify special-causes.
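The three bases Shewhart weighed can be compared numerically at k = 3: the distribution-free Chebyshev bound, the unimodal Vysochanskii–Petunin bound, and the exact normal tail probability.

```python
import math

k = 3.0

# Chebyshev: holds for any distribution with finite variance.
chebyshev = 1.0 / k**2                       # about 0.1111

# Vysochanskii-Petunin: holds for any unimodal distribution.
vysochanskii_petunin = 4.0 / (9.0 * k**2)    # about 0.0494

# Exact two-tailed probability if the distribution is normal.
normal_tail = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(k / math.sqrt(2.0))))  # about 0.0027

for name, p in [("Chebyshev", chebyshev),
                ("Vysochanskii-Petunin", vysochanskii_petunin),
                ("Normal (exact)", normal_tail)]:
    print(f"{name:22s} P(|X - mu| > 3 sigma) <= {p:.4f}")
```

Even the weakest bound keeps false alarms under about 11%, which is the point: 3-sigma limits are economically defensible without committing to any particular distribution.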

Rules for detecting signals

The most common sets are:
• The Western Electric rules
• The Wheeler rules (equivalent to the Western Electric zone tests[7])
• The Nelson rules

There has been particular controversy as to how long a run of observations, all on the same side of the centre line, should count as a signal, with 6, 7, 8 and 9 all being advocated by various writers. The most important principle for choosing a set of rules is that the choice be made before the data is inspected. Choosing rules once the data have been seen tends to increase the Type I error rate owing to testing effects suggested by the data.

Performance of control charts

When a point falls outside of the limits established for a given control chart, those responsible for the underlying process are expected to determine whether a special cause has occurred. If one has, it is appropriate to determine if the results with the special cause are better than or worse than results from common causes alone. If worse, then that cause should be eliminated if possible. If better, it may be appropriate to intentionally retain the special cause within the system producing the results.

It is known that even when a process is in control (that is, no special causes are present in the system), there is approximately a 0.27% probability of a point exceeding 3-sigma control limits. So, even an in-control process plotted on a properly constructed control chart will eventually signal the possible presence of a special cause, even though one may not have actually occurred. For a Shewhart control chart using 3-sigma limits, this false alarm occurs on average once every 1/0.0027 or 370.4 observations. Therefore, the in-control average run length (or in-control ARL) of a Shewhart chart is 370.4.

Meanwhile, if a special cause does occur, it may not be of sufficient magnitude for the chart to produce an immediate alarm condition. If a special cause occurs, one can describe that cause by measuring the change in the mean and/or variance of the process in question. When those changes are quantified, it is possible to determine the out-of-control ARL for the chart.

It turns out that Shewhart charts are quite good at detecting large changes in the process mean or variance, as their out-of-control ARLs are fairly short in these cases. However, for smaller changes (such as a 1- or 2-sigma change in the mean), the Shewhart chart does not detect these changes efficiently. Other types of control charts have been developed, such as the EWMA chart, the CUSUM chart and the real-time contrasts chart, which detect smaller changes more efficiently by making use of information from observations collected prior to the most recent data point. Most control charts work best for numeric data with Gaussian assumptions. The real-time contrasts chart was proposed to handle process data with complex characteristics, e.g. high-dimensional, mixed numerical and categorical, missing-valued, non-Gaussian, or with non-linear relationships.

Alternative bases

In 1935, the British Standards Institution, under the influence of Egon Pearson and against Shewhart's spirit, adopted control charts, replacing 3-sigma limits with limits based on percentiles of the normal distribution. This move continues to be represented by John Oakland and others but has been widely deprecated by writers in the Shewhart–Deming tradition.
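Two of the signals discussed — a single point beyond the 3σ limits, and a long run on one side of the centre line (eight points here, one of the advocated counts) — can be sketched as simple predicates over the plotted points. This follows the spirit of the Western Electric rules but is a simplified illustration, not the full rule set, and the data are invented:

```python
def beyond_limits(points, center, sigma):
    """Rule: any single point more than 3 sigma from the center line."""
    return [i for i, x in enumerate(points) if abs(x - center) > 3 * sigma]

def long_run_one_side(points, center, run=8):
    """Rule: `run` consecutive points all on the same side of the center line."""
    signals = []
    for i in range(len(points) - run + 1):
        window = points[i:i + run]
        if all(x > center for x in window) or all(x < center for x in window):
            signals.append(i + run - 1)  # index of the point completing the run
    return signals

points = [10.1, 9.9, 10.0, 10.2, 10.3, 10.1, 10.2, 10.4, 10.1, 10.2, 10.3, 13.5]
print(beyond_limits(points, center=10.0, sigma=0.5))   # flags the 13.5 outlier
print(long_run_one_side(points, center=10.0))          # flags the sustained shift
```

Note how the two rules catch different failures: the limit rule sees the single large excursion, while the run rule sees a small sustained shift that never crosses the limits — exactly the gap the EWMA and CUSUM charts are designed to close.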

Criticisms

Several authors have criticised the control chart on the grounds that it violates the likelihood principle. However, the principle is itself controversial, and supporters of control charts further argue that, in general, it is impossible to specify a likelihood function for a process not in statistical control, especially where knowledge about the cause system of the process is weak.

Some authors have criticised the use of average run lengths (ARLs) for comparing control chart performance, because that average usually follows a geometric distribution, which has high variability and difficulties.

Some authors have criticized the fact that most control charts focus on numeric data. Nowadays, process data can be much more complex, e.g. non-Gaussian, mixed numerical and categorical, or missing-valued.[8]

Types of charts

| Chart | Process observation | Observations relationships | Observations type | Size of shift to detect |
|---|---|---|---|---|
| x̄ and R chart | Quality characteristic measurement within one subgroup | Independent | Variables | Large (≥ 1.5σ) |
| x̄ and s chart | Quality characteristic measurement within one subgroup | Independent | Variables | Large (≥ 1.5σ) |
| Shewhart individuals control chart (ImR chart or XmR chart) | Quality characteristic measurement for one observation | Independent | Variables† | Large (≥ 1.5σ) |
| Three-way chart | Quality characteristic measurement within one subgroup | Independent | Variables | Large (≥ 1.5σ) |
| p-chart | Fraction nonconforming within one subgroup | Independent | Attributes† | Large (≥ 1.5σ) |
| np-chart | Number nonconforming within one subgroup | Independent | Attributes† | Large (≥ 1.5σ) |
| c-chart | Number of nonconformances within one subgroup | Independent | Attributes† | Large (≥ 1.5σ) |
| u-chart | Nonconformances per unit within one subgroup | Independent | Attributes† | Large (≥ 1.5σ) |
| EWMA chart | Exponentially weighted moving average of quality characteristic measurement within one subgroup | Independent | Attributes or variables | Small (< 1.5σ) |
| CUSUM chart | Cumulative sum of quality characteristic measurement within one subgroup | Independent | Attributes or variables | Small (< 1.5σ) |
| Time series model | Quality characteristic measurement within one subgroup | Autocorrelated | Attributes or variables | N/A |
| Regression control chart | Quality characteristic measurement within one subgroup | Dependent of process control variables | Variables | Large (≥ 1.5σ) |
| Real-time contrasts chart | Sliding window of quality characteristic measurement within one subgroup | Independent | Attributes or variables | Small (< 1.5σ) |

† Some practitioners also recommend the use of Individuals charts for attribute data, particularly when the assumptions of either binomially-distributed data (p- and np-charts) or Poisson-distributed data (u- and c-charts) are violated.[9] Two primary justifications are given for this practice. First, normality is not necessary for statistical control, so the Individuals chart may be used with non-normal data.[10] Second, attribute charts derive the measure of dispersion directly from the mean proportion (by assuming a probability distribution), while Individuals charts derive the measure of dispersion from the data, independent of the mean, making Individuals charts more robust than attributes charts to violations of the assumptions about the distribution of the underlying population.[11] It is sometimes noted that the substitution of the Individuals chart works best for large counts, when the binomial and Poisson distributions approximate a normal distribution, i.e. when the number of trials n > 1000 for p- and np-charts or λ > 500 for u- and c-charts.

Critics of this approach argue that control charts should not be used when their underlying assumptions are violated, such as when process data is neither normally distributed nor binomially (or Poisson) distributed. Such processes are not in control and should be improved before the application of control charts. Additionally, application of the charts in the presence of such deviations increases the type I and type II error rates of the control charts, and may make the chart of little practical use.

Notes
[1] McNeese, William (July 2006). "Over-controlling a Process: The Funnel Experiment" (http://www.spcforexcel.com/overcontrolling-process-funnel-experiment). BPI Consulting, LLC. Retrieved 2010-03-17.
[2] Wheeler, Donald J. (2000). Understanding Variation. Knoxville, Tennessee: SPC Press. p. 140. ISBN 0945320531.
[3] Tague, Nancy R. (2004). "Seven Basic Quality Tools" (http://www.asq.org/learn-about-quality/seven-basic-quality-tools/overview/overview.html). The Quality Toolbox. Milwaukee, Wisconsin: American Society for Quality. p. 15. Retrieved 2010-02-05.
[4] Western Electric - A Brief History (http://www.porticus.org/bell/doc/western_electric.doc)
[5] "Why SPC?" British Deming Association. SPC Press, Inc. 1992.
[6] Wheeler, Donald J. (1 Apr 2010). "Are You Sure We Don't Need Normally Distributed Data?" (http://www.qualitydigest.com/inside/quality-insider-column/are-you-sure-we-don-t-need-normally-distributed-data.html). Quality Digest. Retrieved 7 December 2010.
[7] Wheeler, Donald J.; Chambers, David S. (1992). Understanding Statistical Process Control (2nd ed.). Knoxville, Tennessee: SPC Press. p. 96. ISBN 0-945320-13-2. OCLC 27187772.
[8] Deng, H.; Runger, G.; Tuv, E. (2011). "System monitoring with real-time contrasts" (http://enpub.fulton.asu.edu/hdeng3/RealtimeJQT2011.pdf). Journal of Quality Technology (forthcoming).
[9] Wheeler, Donald J. (2000). Understanding Variation: The Key to Managing Chaos (2nd ed.). SPC Press. ISBN 0-945320-53-1.
[10] Staufer, Rip (1 Apr 2010). "Some Problems with Attribute Charts" (http://www.qualitydigest.com/inside/quality-insider-article/some-problems-attribute-charts.html). Quality Digest. Retrieved 2 Apr 2010.
[11] Wheeler, Donald J. "What About Charts for Count Data?" (http://www.qualitydigest.com/jul/spctool.html). Quality Digest. Retrieved 2010-03-23.

Bibliography
• Deming, W. E. (1975). "On probability as a basis for action." The American Statistician, 29(4), pp. 146-152.
• Deming, W. E. (1982). Out of the Crisis: Quality, Productivity and Competitive Position. ISBN 0-521-30553-5.
• Deng, H.; Runger, G.; Tuv, E. (2011). "System monitoring with real-time contrasts." Journal of Quality Technology (forthcoming).
• Mandel, B. J. (1969). "The Regression Control Chart." Journal of Quality Technology, 1(1).
• Oakland, J. (2002). Statistical Process Control. ISBN 0-7506-5766-9.
• Shewhart, W. A. (1931). Economic Control of Quality of Manufactured Product. ISBN 0-87389-076-0.
• Shewhart, W. A. (1939). Statistical Method from the Viewpoint of Quality Control. ISBN 0-486-65232-7.
• Wheeler, D. J.; Chambers, D. S. (1992). Understanding Statistical Process Control. ISBN 0-945320-13-2.
• Wheeler, D. J. (2000). Normality and the Process-Behaviour Chart. ISBN 0-945320-56-6.
• Wheeler, Donald J. (2000). Understanding Variation: The Key to Managing Chaos (2nd ed.). Knoxville, Tennessee: SPC Press. ISBN 0-945320-53-1.

Histogram

One of the Seven Basic Tools of Quality
First described by: Karl Pearson
Purpose: To roughly assess the probability distribution of a given variable by depicting the frequencies of observations occurring in certain ranges of values

In statistics, a histogram is a graphical representation showing a visual impression of the distribution of data. It is an estimate of the probability distribution of a continuous variable and was first introduced by Karl Pearson.[1] A histogram consists of tabular frequencies, shown as adjacent rectangles, erected over discrete intervals (bins), with an area equal to the frequency of the observations in the interval. The height of a rectangle is also equal to the frequency density of the interval, i.e., the frequency divided by the width of the interval. The total area of the histogram is equal to the number of data.

A histogram may also be normalized, displaying relative frequencies. It then shows the proportion of cases that fall into each of several categories, with the total area equaling 1. The categories are usually specified as consecutive, non-overlapping intervals of a variable. The categories (intervals) must be adjacent, and often are chosen to be of the same size.[2] Histograms are used to plot the density of data, and often for density estimation: estimating the probability density function of the underlying variable. The total area of a histogram used for probability density is always normalized to 1. If the lengths of the intervals on the x-axis are all 1, then a histogram is identical to a relative frequency plot.

An alternative to the histogram is kernel density estimation, which uses a kernel to smooth samples. This will construct a smooth probability density function, which will in general more accurately reflect the underlying variable. The histogram is one of the seven basic tools of quality control.[3]
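The area properties just stated — rectangle area equals frequency, total area equals the number of data, and normalized total area equals 1 — can be verified on a tiny example. The data and bins below are invented; note the bins need not have equal widths:

```python
data = [1.2, 1.7, 2.1, 2.3, 2.8, 3.4, 3.5, 4.9]
bins = [(1, 2), (2, 3), (3, 5)]   # adjacent, non-overlapping intervals

# Frequency per bin, then frequency density = count / width.
counts = [sum(lo <= x < hi for x in data) for lo, hi in bins]
densities = [c / (hi - lo) for c, (lo, hi) in zip(counts, bins)]

# Total area of the (unnormalized) histogram equals the number of data points.
total_area = sum(d * (hi - lo) for d, (lo, hi) in zip(densities, bins))
assert total_area == len(data)

# Normalizing the densities by the number of data gives total area 1.
norm_densities = [d / len(data) for d in densities]
print(sum(nd * (hi - lo) for nd, (lo, hi) in zip(norm_densities, bins)))
```

The density step is what makes unequal bin widths comparable: a wide bin with many observations can still have a low bar if its count per unit width is small.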

Etymology

The etymology of the word histogram is uncertain. Sometimes it is said to be derived from the Greek histos 'anything set upright' (as the masts of a ship, the bar of a loom, or the vertical bars of a histogram) and gramma 'drawing, record, writing'. It is also said that Karl Pearson, who introduced the term in 1895, derived the name from "historical diagram".[4]

Examples

The U.S. Census Bureau found that there were 124 million people who work outside of their homes.[5] Using their data on the time occupied by travel to work, Table 2 below shows that the absolute number of people who responded with travel times "at least 15 but less than 20 minutes" is higher than the numbers for the categories above and below it. This is likely due to people rounding their reported journey time. The problem of reporting values as somewhat arbitrarily rounded numbers is a common phenomenon when collecting data from people.

[Figure: An example histogram of the heights of 31 Black Cherry trees.]
[Figure: Histogram of travel time, US 2000 census. Area under the curve equals the total number of cases. This diagram uses Q/width from the table.]

Data by absolute numbers

Interval  Width  Quantity  Quantity/width
   0        5      4180        836
   5        5     13687       2737
  10        5     18618       3723
  15        5     19634       3926
  20        5     17981       3596
  25        5      7190       1438
  30        5     16369       3273
  35        5      3212        642

Data by absolute numbers (continued)

Interval  Width  Quantity  Quantity/width
  40        5      4122        824
  45       15      9200        613
  60       30      6461        215
  90       60      3435         57

[Figure: Histogram of travel time, US 2000 census, with Q in thousands. This type of histogram shows absolute numbers. The area under the curve represents the total number of cases (124 million). This diagram uses Q/width from the table.]

This histogram shows the number of cases per unit interval so that the height of each bar is equal to the proportion of total people in the survey who fall into that category. It differs from the first only in the vertical scale: the height of each bar is the decimal percentage of the total that each category represents, and the total area of all the bars is equal to 1, the decimal equivalent of 100%. The curve displayed is a simple density estimate.

[Figure: Histogram of travel time, US 2000 census. Area under the curve equals 1. This version shows proportions, and is also known as a unit area histogram. This diagram uses Q/total/width from the table.]

Data by proportion

Interval  Width  Quantity (Q)  Q/total/width
   0        5       4180         0.0067
   5        5      13687         0.0221
  10        5      18618         0.0300
  15        5      19634         0.0316
  20        5      17981         0.0290
  25        5       7190         0.0116
  30        5      16369         0.0264
  35        5       3212         0.0052
  40        5       4122         0.0066
  45       15       9200         0.0049
  60       30       6461         0.0017
  90       60       3435         0.0005
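The "Data by proportion" column can be recomputed directly from the absolute quantities in the tables, confirming that the unit-area histogram's bars sum to 1:

```python
# Rebuild the proportion column from the census table above:
# density = Q / total / width, so the bar areas sum to exactly 1.
widths =     [5, 5, 5, 5, 5, 5, 5, 5, 5, 15, 30, 60]
quantities = [4180, 13687, 18618, 19634, 17981, 7190,
              16369, 3212, 4122, 9200, 6461, 3435]   # thousands of people

total = sum(quantities)                               # ~124 million
density = [q / total / w for q, w in zip(quantities, widths)]
area = sum(d * w for d, w in zip(density, widths))    # total bar area
```

The first density value, 4180 / 124089 / 5 ≈ 0.0067, matches the table.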

Shape or form of a distribution

The histogram provides important information about the shape of a distribution. According to the values presented, the histogram is either highly or moderately skewed to the left or right. A symmetrical shape is also possible, although a histogram is never perfectly symmetrical. If the histogram is skewed to the left, or negatively skewed, the tail extends further to the left. An example for a distribution skewed to the left might be the relative frequency of exam scores: most of the scores are above 70 percent and only a few low scores occur. An example for a distribution skewed to the right, or positively skewed, is a histogram showing the relative frequency of housing values: a relatively small number of expensive homes create the skewness to the right, and the tail extends further to the right. The shape of a symmetrical distribution mirrors the skewness of the left or right tail. Histograms can be unimodal, bi-modal or multi-modal, depending on the dataset (e.g. the histogram of data for IQ scores).[6]

[Figure: An ordinary and a cumulative histogram of the same data. The data shown is a random sample of 10,000 points from a normal distribution with a mean of 0 and a standard deviation of 1.]

Activities and demonstrations

The SOCR resource pages contain a number of hands-on interactive activities demonstrating the concept of a histogram, histogram construction[8] and manipulation[9] using Java applets and charts.[10]

Mathematical definition

In a more general mathematical sense, a histogram is a function mi that counts the number of observations that fall into each of the disjoint categories (known as bins), whereas the graph of a histogram is merely one way to represent a histogram. Thus, if we let n be the total number of observations and k be the total number of bins, the histogram mi meets the following condition:

    n = Σ (i = 1 to k) m_i

In other words, a histogram represents a frequency distribution by means of rectangles whose widths represent class intervals and whose areas are proportional to the corresponding frequencies. The intervals are placed together in order to show that the data represented by the histogram, while exclusive, is also continuous. For example, in a histogram it is possible to have two connecting intervals of 10.5–20.5 and 20.5–33.5, but not two connecting intervals of 10.5–20.5 and 22.5–32.5. Empty intervals are represented as empty and not skipped.[7]
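The definition of the histogram as a counting function m_i over disjoint bins can be sketched minimally (bin edges and data here are hypothetical):

```python
def histogram(data, edges):
    """Count observations into k disjoint bins defined by consecutive
    edges [e0, e1, ..., ek]; bin i holds values with e_i <= x < e_{i+1},
    and the last bin also includes the right edge."""
    m = [0] * (len(edges) - 1)
    for x in data:
        for i in range(len(m)):
            last = i == len(m) - 1
            if edges[i] <= x < edges[i + 1] or (last and x == edges[-1]):
                m[i] += 1
                break
    return m

data = [1.5, 2.1, 2.8, 3.3, 4.9, 5.0]
m = histogram(data, [1, 2, 3, 4, 5])
# The condition n = sum(m_i) holds whenever every observation
# falls inside some bin.
```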

Cumulative histogram

A cumulative histogram is a mapping that counts the cumulative number of observations in all of the bins up to the specified bin. That is, the cumulative histogram Mi of a histogram mj is defined as:

    M_i = Σ (j = 1 to i) m_j

Number of bins and width

There is no "best" number of bins, and different bin sizes can reveal different features of the data. Some theoreticians have attempted to determine an optimal number of bins, but these methods generally make strong assumptions about the shape of the distribution. Depending on the actual data distribution and the goals of the analysis, different bin widths may be appropriate, so experimentation is usually needed to determine an appropriate width. There are, however, various useful guidelines and rules of thumb.[11]

The number of bins k can be assigned directly or can be calculated from a suggested bin width h as:

    k = ⌈ (max x − min x) / h ⌉

The braces indicate the ceiling function.

Square-root choice, which takes the square root of the number of data points in the sample (used by Excel histograms and many others):

    k = ⌈ √n ⌉

Sturges' formula,[12] which implicitly bases the bin sizes on the range of the data, and can perform poorly if n < 30:

    k = ⌈ log2 n ⌉ + 1

Scott's choice,[13] where σ̂ is the sample standard deviation:

    h = 3.5 σ̂ / n^(1/3)

Freedman–Diaconis' choice,[14] which is based on the interquartile range, denoted by IQR:

    h = 2 IQR(x) / n^(1/3)

Choice based on minimization of an estimated L2 risk function,[15] where m̄ and v are the mean and biased variance of a histogram with bin-width h, m̄ = (1/k) Σ m_i and v = (1/k) Σ (m_i − m̄)²:

    argmin over h of (2 m̄ − v) / h²
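The first four rules of thumb above can be sketched as follows. Note, as an implementation detail not from the article, that Python's statistics.quantiles (3.8+) uses its default "exclusive" method, which is only one of several IQR conventions:

```python
import math
import statistics

def bin_count_rules(data):
    """Suggested bin counts from the rules of thumb above; widths h are
    reduced to counts via k = ceil((max - min) / h)."""
    n = len(data)
    span = max(data) - min(data)
    sqrt_k = math.ceil(math.sqrt(n))                      # square-root choice
    sturges_k = math.ceil(math.log2(n)) + 1               # Sturges' formula
    h_scott = 3.5 * statistics.stdev(data) / n ** (1 / 3) # Scott's choice
    q1, _, q3 = statistics.quantiles(data, n=4)           # quartiles
    h_fd = 2 * (q3 - q1) / n ** (1 / 3)                   # Freedman-Diaconis
    return {"sqrt": sqrt_k,
            "sturges": sturges_k,
            "scott": math.ceil(span / h_scott),
            "freedman_diaconis": math.ceil(span / h_fd)}

rules = bin_count_rules(list(range(100)))
```

Different rules typically disagree, which is the point of the passage above: experimentation is usually needed.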

References
[1] Pearson, K. (1895). "Contributions to the Mathematical Theory of Evolution. II. Skew Variation in Homogeneous Material". Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 186: 343–414. Bibcode 1895RSPTA.186..343P. doi:10.1098/rsta.1895.0010.
[2] Howitt, D. and Cramer, D. (2008). Statistics in Psychology. Prentice Hall.
[3] Nancy R. Tague (2004). "Seven Basic Quality Tools" (http://www.asq.org/learn-about-quality/seven-basic-quality-tools/overview/overview.html). The Quality Toolbox. Milwaukee, Wisconsin: American Society for Quality. p. 15. Retrieved 2010-02-05.
[4] M. Eileen Magnello. "Karl Pearson and the Origins of Modern Statistics: An Elastician becomes a Statistician" (http://www.rutherfordjournal.org/article010107.html). The New Zealand Journal for the History and Philosophy of Science and Technology, volume 1. ISSN 1177-1380.
[5] US 2000 census (http://www.census.gov/prod/2004pubs/c2kbr-33.pdf).
[6] Dean, S., & Illowsky, B. (2009, February 19). Descriptive Statistics: Histogram. Retrieved from the Connexions Web site: http://cnx.org/content/m16298/1.11/
[7] Anderson, David R. "Statistics for Business and Economics".
[8] http://wiki.stat.ucla.edu/socr/index.php/SOCR_EduMaterials_ModelerActivities_MixtureModel_1
[9] http://wiki.stat.ucla.edu/socr/index.php/SOCR_EduMaterials_Activities_PowerTransformFamily_Graphs
[10] http://www.socr.ucla.edu/htmls/SOCR_Charts.html
[11] e.g. § 5.6 "Density Estimation", W. N. Venables and B. D. Ripley, Modern Applied Statistics with S (4th edition), Springer.
[12] Sturges, H. A. (1926). "The choice of a class interval". Journal of the American Statistical Association: 65–66.
[13] Scott, David W. (1979). "On optimal and data-based histograms". Biometrika 66 (3): 605–610. doi:10.1093/biomet/66.3.605.
[14] Freedman, David; Diaconis, P. (1981). "On the histogram as a density estimator: L2 theory". Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete 57 (4): 453–476. doi:10.1007/BF01025868.
[15] Shimazaki, H.; Shinomoto, S. (2007). "A method for selecting the bin size of a time histogram" (http://www.mitpressjournals.org/doi/abs/10.1162/neco.2007.19.6.1503). Neural Computation 19 (6): 1503–1527. doi:10.1162/neco.2007.19.6.1503. PMID 17444758.

Further reading
• Lancaster, H. O. An Introduction to Medical Statistics. John Wiley and Sons, 1974. ISBN 0-471-51250-8.

External links
• Journey To Work and Place Of Work (http://www.census.gov/population/www/socdemo/journey.html) (location of census document cited in example)
• Histograms: Construction, Analysis and Understanding, with external links and an application to particle physics (http://quarknet.fnal.gov/toolkits/ati/histograms.html)
• A Method for Selecting the Bin Size of a Histogram (http://2000.jukuin.keio.ac.jp/shimazaki/res/histogram.html)
• Interactive histogram generator (http://www.shodor.org/interactivate/activities/histogram/)
• Matlab function to plot nice histograms (http://www.mathworks.com/matlabcentral/fileexchange/27388-plot-and-compare-nice-histograms-by-default)
• Smooth histogram for signals and images from a few samples (http://www.mathworks.com/matlabcentral/fileexchange/30480-histconnect)

Pareto chart

One of the Seven Basic Tools of Quality
First described by: Joseph M. Juran
Purpose: To assess the most frequently-occurring defects by category

A Pareto chart, named after Vilfredo Pareto, is a type of chart that contains both bars and a line graph, where individual values are represented in descending order by bars, and the cumulative total is represented by the line. The left vertical axis is the frequency of occurrence, but it can alternatively represent cost or another important unit of measure. The right vertical axis is the cumulative percentage of the total number of occurrences, total cost, or total of the particular unit of measure. Because the reasons are in decreasing order, the cumulative function is a concave function.

[Figure: Simple example of a Pareto chart using hypothetical data showing the relative frequency of reasons for arriving late at work.]

To take the example above, in order to lower the amount of late arriving by 80%, it is sufficient to solve the first three issues. The purpose of the Pareto chart is to highlight the most important among a (typically large) set of factors. In quality control, it often represents the most common sources of defects, the highest occurring type of defect, or the most frequent reasons for customer complaints. Wilkinson (2006) devised an algorithm for producing statistically-based acceptance limits (similar to confidence intervals) for each bar in the Pareto chart.

These charts can be generated by simple spreadsheet programs, such as OpenOffice.org Calc and Microsoft Excel, by specialized statistical software tools, and by online quality charts generators. The Pareto chart is one of the seven basic tools of quality control.[1]
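The "first three issues" observation can be reproduced with a short sketch. The category counts below are hypothetical stand-ins for the late-arrival reasons in the figure:

```python
# A minimal Pareto ordering: sort defect counts in descending order,
# compute the cumulative-percentage line, and find how many categories
# are needed to cover 80% of all occurrences.
counts = {"traffic": 25, "child care": 11, "overslept": 8,
          "emergency": 4, "weather": 2}          # hypothetical data

ordered = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
total = sum(counts.values())

cumulative, running = [], 0
for _, c in ordered:
    running += c
    cumulative.append(100 * running / total)     # the line graph's values

# Number of leading categories covering at least 80% of occurrences.
needed = next(i + 1 for i, pct in enumerate(cumulative) if pct >= 80)
```

With these counts the first three categories cover 88% of occurrences, matching the "solve the first three issues" reading of the chart.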

References
[1] Nancy R. Tague (2004). "Seven Basic Quality Tools" (http://www.asq.org/learn-about-quality/seven-basic-quality-tools/overview/overview.html). The Quality Toolbox. Milwaukee, Wisconsin: American Society for Quality. p. 15. Retrieved 2010-02-05.

Further reading
• Hart, K. M., & Hart, R. F. (1989). Quantitative methods for quality improvement. Milwaukee, WI: ASQC Quality Press.
• Juran, J. M. (1962). Quality control handbook. New York: McGraw-Hill.
• Juran, J. M., & Gryna, F. M. (1970). Quality planning and analysis. New York: McGraw-Hill.
• Montgomery, D. C. (1985). Statistical quality control. New York: Wiley.
• Montgomery, D. C. (1991). Design and analysis of experiments, 3rd ed. New York: Wiley.
• Pyzdek, T. (1989). What every engineer should know about quality control. New York: Marcel Dekker.
• Vaughn, R. C. (1974). Quality control. Ames, IA: Iowa State Press.
• Wilkinson, L. (2006). "Revising the Pareto Chart". The American Statistician 60: 332–334.

Scatter plot

One of the Seven Basic Tools of Quality
First described by: Francis Galton
Purpose: To identify the type of relationship (if any) between two variables

A scatter plot or scattergraph is a type of mathematical diagram using Cartesian coordinates to display values for two variables for a set of data. The data is displayed as a collection of points, each having the value of one variable determining the position on the horizontal axis and the value of the other variable determining the position on the vertical axis.[2] This kind of plot is also called a scatter chart, scattergram, scatter diagram or scatter graph.

Overview

A scatter plot is used when a variable exists that is under the control of the experimenter. If a parameter exists that is systematically incremented and/or decremented by the other, it is called the control parameter or independent variable and is customarily plotted along the horizontal axis. The measured or dependent variable is customarily plotted along the vertical axis. If no dependent variable exists, either type of variable can be plotted on either axis and a scatter plot will illustrate only the degree of correlation (not causation) between two variables.

[Figure: Waiting time between eruptions and the duration of the eruption for the Old Faithful Geyser in Yellowstone National Park, Wyoming, USA. This chart suggests there are generally two "types" of eruptions: short-wait-short-duration, and long-wait-long-duration.]

A scatter plot can suggest various kinds of correlations between variables with a certain confidence interval. Correlations may be positive (rising), negative (falling), or null (uncorrelated). If the pattern of dots slopes from lower left to upper right, it suggests a positive correlation between the variables being studied. If the pattern of dots slopes from upper left to lower right, it suggests a negative correlation. A line of best fit (alternatively called a 'trendline') can be drawn in order to study the correlation between the variables. An equation for the correlation between the variables can be determined by established best-fit procedures. For a linear correlation, the best-fit procedure is known as linear regression and is guaranteed to generate a correct solution in a finite time. No universal best-fit procedure is guaranteed to generate a correct solution for arbitrary relationships.

A scatter plot is also very useful when we wish to see how two comparable data sets agree with each other. In this case, an identity line, i.e., a y = x line or a 1:1 line, is often drawn as a reference. The more the two data sets agree, the more the scatters tend to concentrate in the vicinity of the identity line; if the two data sets are numerically identical, the scatters fall on the identity line exactly.

One of the most powerful aspects of a scatter plot, however, is its ability to show nonlinear relationships between variables. Furthermore, if the data is represented by a mixture model of simple relationships, these relationships will be visually evident as superimposed patterns.

The scatter diagram is one of the seven basic tools of quality control.[3] A 3D scatter plot allows for the visualization of multivariate data of up to four dimensions. The scatter plot takes multiple scalar variables and uses them for different axes in phase space. The different variables are combined to form coordinates in the phase space and they are displayed using glyphs and colored using another scalar variable.[1]

Example

For example, to display values for "lung capacity" (first variable) and how long a person can hold his breath (second variable), a researcher would choose a group of people to study, then measure each one's lung capacity and how long that person could hold his breath. The researcher would then plot the data in a scatter plot, assigning "lung capacity" to the horizontal axis and "time holding breath" to the vertical axis. A person with a lung capacity of 400 ml who held his breath for 21.7 seconds would be represented by a single dot on the scatter plot at the point (400, 21.7) in the Cartesian coordinates. The scatter plot of all the people in the study would enable the researcher to obtain a visual comparison of the two variables in the data set, and will help to determine what kind of relationship there might be between the two variables.

References
[1] Visualizations that have been created with VisIt (https://wci.llnl.gov/codes/visit/gallery.html), at wci.llnl.gov. Last updated: November 8, 2007.
[2] Utts, Jessica M. Seeing Through Statistics, 3rd Edition. Thomson Brooks/Cole, 2005, pp 166-167. ISBN 0-534-39402-7.
[3] Nancy R. Tague (2004). "Seven Basic Quality Tools" (http://www.asq.org/learn-about-quality/seven-basic-quality-tools/overview/overview.html). The Quality Toolbox. Milwaukee, Wisconsin: American Society for Quality. p. 15. Retrieved 2010-02-05.

External links
• What is a scatterplot? (http://www.psychwiki.com/wiki/What_is_a_scatterplot?)
• Correlation scatter-plot matrix for ordered-categorical data - explanation and R code (http://www.r-statistics.com/2010/04/correlation-scatter-plot-matrix-for-ordered-categorical-data/)
• Tool for visualizing scatter plots (http://www.icanplot.org)
• Density scatterplot for large datasets (hundreds of millions of points) (http://www.r-bloggers.com/ggplot2-for-big-data/)
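The least-squares trendline described in the Overview can be fitted from scratch with the standard formulas. The lung-capacity data below are hypothetical, apart from the (400, 21.7) point used in the Example:

```python
# Ordinary least-squares fit of y = slope * x + intercept:
# slope = sum((x - mean_x)(y - mean_y)) / sum((x - mean_x)^2).
xs = [350, 400, 450, 500, 550]        # lung capacity, ml (hypothetical)
ys = [18.0, 21.7, 24.1, 27.8, 30.2]   # time holding breath, s (hypothetical)

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx
# A positive slope corresponds to the rising, positively correlated
# pattern of dots described in the Overview.
```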

Stratified sampling

In statistics, stratified sampling is a method of sampling from a population. When subpopulations within an overall population vary, it is advantageous to sample each subpopulation (stratum) independently. Stratification is the process of dividing members of the population into homogeneous subgroups before sampling. The strata should be mutually exclusive: every element in the population must be assigned to only one stratum. The strata should also be collectively exhaustive: no population element can be excluded. Then random or systematic sampling is applied within each stratum. This often improves the representativeness of the sample by reducing sampling error. It can produce a weighted mean that has less variability than the arithmetic mean of a simple random sample of the population.

In computational statistics, stratified sampling is a method of variance reduction when Monte Carlo methods are used to estimate population statistics from a known population.

Stratified sampling strategies

1. Proportionate allocation uses a sampling fraction in each of the strata that is proportional to that of the total population. For instance, if the population consists of 60% in the male stratum and 40% in the female stratum, then the relative size of the two samples (three males, two females) should reflect this proportion.

2. Optimum allocation (or disproportionate allocation) - each stratum is proportionate to the standard deviation of the distribution of the variable. Larger samples are taken in the strata with the greatest variability to generate the least possible sampling variance.

A real-world example of using stratified sampling would be for a political survey. If the respondents needed to reflect the diversity of the population, the researcher would specifically seek to include participants of various minority groups such as race or religion, based on their proportionality to the total population as mentioned above. A stratified survey could thus claim to be more representative of the population than a survey of simple random sampling or systematic sampling. Similarly, if population density varies greatly within a region, stratified sampling will ensure that estimates can be made with equal accuracy in different parts of the region, and that comparisons of sub-regions can be made with equal statistical power. For example, in Ontario a survey taken throughout the province might use a larger sampling fraction in the less populated north, since the disparity in population between north and south is so great that a sampling fraction based on the provincial sample as a whole might result in the collection of only a handful of data from the north. Randomized stratification can also be used to improve population representativeness in a study.

Disadvantages

Stratified sampling is not useful when the population cannot be exhaustively partitioned into disjoint subgroups. It would be a misapplication of the technique to make subgroups' sample sizes proportional to the amount of data available from the subgroups, rather than scaling sample sizes to subgroup sizes (or to their variances, if known to vary significantly, e.g. by means of an F test). Data representing each subgroup are taken to be of equal importance if suspected variation among them warrants stratified sampling. If, on the other hand, the very variances vary so much, among subgroups, that the data need to be stratified by variance, there is no way to make the subgroup sample sizes proportional (at the same time) to the subgroups' sizes within the total population. (What is the most efficient way to partition sampling resources among groups that vary in both their means and their variances?)

Practical example

In general the size of the sample in each stratum is taken in proportion to the size of the stratum. This is called proportional allocation. Suppose that in a company there are the following staff:[1]
• male, full time: 90
• male, part time: 18
• female, full time: 9
• female, part time: 63
• Total: 180
and we are asked to take a sample of 40 staff, stratified according to the above categories.

The first step is to find the total number of staff (180) and calculate the percentage in each group:
• % male, full time = 90 / 180 = 50%
• % male, part time = 18 / 180 = 10%
• % female, full time = 9 / 180 = 5%
• % female, part time = 63 / 180 = 35%

This tells us that of our sample of 40:
• 50% should be male, full time: 50% of 40 is 20.
• 10% should be male, part time: 10% of 40 is 4.
• 5% should be female, full time: 5% of 40 is 2.
• 35% should be female, part time: 35% of 40 is 14.

Another easy way without having to calculate the percentage is to multiply each group size by the sample size and divide by the total population size (the size of the entire staff):
• male, full time = 90 × (40 / 180) = 20
• male, part time = 18 × (40 / 180) = 4
• female, full time = 9 × (40 / 180) = 2
• female, part time = 63 × (40 / 180) = 14

References
[1] http://www.coventry.ac.uk/ec/~nhunt/meths/strati.html Accessed 2008/01/27
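The "multiply each group size by the sample size and divide by the total population size" rule from the staff example can be written directly:

```python
# Proportional allocation for the staff example above:
# stratum sample size = stratum size * sample size / population size.
strata = {"male, full time": 90, "male, part time": 18,
          "female, full time": 9, "female, part time": 63}
sample_size = 40
population = sum(strata.values())   # 180

allocation = {name: size * sample_size // population
              for name, size in strata.items()}
# → {'male, full time': 20, 'male, part time': 4,
#    'female, full time': 2, 'female, part time': 14}
```

Here every product divides evenly; with awkward totals the integer division would need a rounding scheme that preserves the overall sample size.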

Root cause analysis

Root cause analysis (RCA) is a class of problem solving methods aimed at identifying the root causes of problems or events. The practice of RCA is predicated on the belief that problems are best solved by attempting to address, correct or eliminate root causes, as opposed to merely addressing the immediately obvious symptoms. By directing corrective measures at root causes, it is more probable that problem recurrence will be prevented. However, it is recognized that complete prevention of recurrence by one corrective action is not always possible. Conversely, there may be several effective measures (methods) that address the root causes of a problem. Thus, RCA is often considered to be an iterative process, and is frequently viewed as a tool of continuous improvement. In practice more than one "cause" is allowed and more than one corrective action is not forbidden; for example, in the U.S. nuclear power industry the NRC requires that "In the case of significant conditions adverse to quality, the measures shall assure that the cause of the condition is determined and corrective action taken to prevent repetition." [10CFR50, Appendix B, Criterion XVI, Sentence 2]

RCA is typically used as a reactive method of identifying event(s) causes, revealing problems and solving them. Analysis is done after an event has occurred. Insights in RCA may make it useful as a pro-active method. In that event, RCA can be used to forecast or predict probable events even before they occur. While one follows the other, RCA is a completely separate process to Incident Management.

Root cause analysis is not a single, sharply defined methodology; there are many different tools, processes, and philosophies for performing RCA analysis. However, several very-broadly defined approaches or "schools" can be identified by their basic approach or field of origin:

• Safety-based RCA descends from the fields of accident analysis and occupational safety and health.
• Production-based RCA has its origins in the field of quality control for industrial manufacturing.
• Process-based RCA is basically a follow-on to production-based RCA, but with a scope that has been expanded to include business processes.
• Failure-based RCA is rooted in the practice of failure analysis as employed in engineering and maintenance.
• Systems-based RCA has emerged as an amalgamation of the preceding schools, along with ideas taken from fields such as change management, risk management, and systems analysis.

Despite the different approaches among the various schools of root cause analysis, there are some common principles. It is also possible to define several general processes for performing RCA.

General principles of root cause analysis

1. The primary aim of RCA is to identify the factors that resulted in the nature, the magnitude, the location, and the timing of the harmful outcomes (consequences) of one or more past events, in order to identify what behaviors, actions, inactions, or conditions need to be changed to prevent recurrence of similar harmful outcomes and to identify the lessons to be learned to promote the achievement of better consequences. ("Success" is defined as the near-certain prevention of recurrence.)

2. To be effective, RCA must be performed systematically, usually as part of an investigation, with conclusions and root causes identified backed up by documented evidence. Usually a team effort is required.

4. Threats to cultures often meet with resistance. or even required. First: Is it readable? If it is readable it will be grammatically correct. More importantly. General process for performing and documenting an RCA-based Corrective Action Notice that RCA (in steps 3. a "non-punitory" policy towards problem identifiers may be required. have reduced . the locations. we cannot determine what an effective corrective action for the defined problem will be. The purpose of identifying all solutions to a problem is to prevent recurrence at lowest cost in the simplest way. 3. Check that each corrective action would. classifying that along a timeline of events to the final failure or crisis. For every behavior. Show that the important attributes of the harmful outcomes were completely explained by the deepest conditions. 8. behaviors. Ask "why" and identify the causes associated with each step in the sequence towards the defined problem or event. If there are alternatives that are equally effective. then the simplest or lowest cost approach is preferred. Root cause analysis can help to transform a reactive culture (that reacts to problems) into a forward-looking culture that solves problems before they occur or escalate. and inactions. if pre-implemented before the event. 2. because it directs the corrective action at the true root cause of the problem. Effective problem statements and event descriptions (as failures." 6. Root causes identified depend on the way in which the problem or event is defined. Classify causes into causal factors that relate to an event in the sequence. the difficult part is demonstrating the persistence and sustaining the effort required to develop them. Gather data and evidence. like other 'deliverables' can vary in quality. If there are multiple root causes. 28 Evaluating s root cause analysis Root Cause Analysis Reports. and inactions. which is often the case. actions. Define the problem or describe the event factually. 
There are some general possiblilities for evaluating root cause analysis outputs. 4 and 5) forms the most critical part of successful corrective action. 1. and the timings. 7. and the like. To be effective. terms will be defined. 5. condition. it reduces the frequency of problems occurring over time within the environment where the RCA process is used. 2. 6. actions. reveal those clearly for later optimum selection. and inaction specify in the "timeline" what should have been when it differs from the actual. Second: Does it contain a complete set of all of the causal relationships? If it did contain a "complete set of all of the causal relationships" one could (at least): 1. and root causes. The root cause is secondary to the goal of prevention. 5.Root cause analysis 3. but without knowing the root cause. it will be free on internal inconsistencies. There may be more than one root cause for an event or a problem. for example) are helpful. that if applied can be agreed to have interrupted that step of the sequence chain. it will contain appropriate graphics. root cause(s) and the defined problem or event to prevent in the future. Include the qualitative and quantitative attributes (properties) of the harmful outcomes. There may be other forms of management support required to achieve RCA effectiveness and success. RCA is a threat to many cultures and environments. action. Trace the causal relationships from the harmful outcomes to the deepest conditions. the sentences will make sense. This usually includes specifying the natures. Identify corrective action(s) that will with certainty prevent recurrence of each harmful effect. the magnitudes. "Why" is taken to mean "What were the factors that directly resulted in the effect?" 4. Each stakeholder can have their own qualitative and quantitative acceptance criteria. including outcomes and factors. behaviors. For example. identify all other harmful factors that have equal or better claim to be called "root causes. 
the analysis should establish a sequence of events or timeline to understand the relationships between contributory (causal) factors.

7. Identify solutions that are effective, prevent recurrence with reasonable certainty, have the consensus agreement of the group, are within your control, meet your goals and objectives, and do not introduce other new, unforeseen problems.
8. Implement the recommended root cause correction(s).
9. Ensure effectiveness by observing the implemented solutions.
10. Identify and address the other instances of each harmful outcome and harmful factor.
11. Other methodologies for problem solving and problem avoidance may be useful.

Root cause analysis techniques

• Barrier analysis – a technique often used in process industries. It is based on tracing energy flows, with a focus on barriers to those flows, to identify how and why the barriers did not prevent the energy flows from causing harm, or how they prevented specific harmful effects.
• Bayesian inference
• Change analysis – an investigation technique often used for problems or accidents. It is based on comparing a situation that does not exhibit the problem to one that does, in order to identify the changes or differences that might explain why the problem occurred.
• Current Reality Tree – a method developed by Eliyahu M. Goldratt in his theory of constraints that guides an investigator to identify and relate all root causes using a cause-effect tree whose elements are bound by rules of logic (Categories of Legitimate Reservation). The CRT begins with a brief list of the undesirable things we see around us, and then guides us towards one or more root causes. This method is particularly powerful when the system is complex, there is no obvious link between the observed undesirable things, and a deep understanding of the root cause(s) is desired.
• Failure mode and effects analysis
• Fault tree analysis
• Five whys – emphasizes recursive depth, using the heuristic that you are probably not done until you have looked five levels deep.
• Ishikawa diagrams – emphasize initial breadth, using a checklist of types of causes that should be considered.
• Pareto analysis ("80/20 rule")
• RPR Problem Diagnosis – an ITIL-aligned method for diagnosing IT problems.
• Why–because analysis – emphasizes recursive breadth, using the concepts of necessary and sufficient causes.
• Kepner-Tregoe Approach
• Project management approaches – an adverse event can be viewed as a project whose final product was harm. The event can be understood by re-casting it in the classical project management devices such as the Work Breakdown Structure, Gantt Chart, and Planning Logic Network.
• Re-enactment – for example, having the participants in the event do it over the way it was done (with due care to avoid the same harmful outcomes).
• Comparative re-enactment – doing it over the right way as well as the way it was actually done.
• Re-enactment using a computer or a simulator.
• Re-construction – reassembling all of the available accident debris to see clues as to how the disassembly occurred.
• "Delta work" – comparing the way an episode did happen with the way it was intended to happen.

Common cause analysis (CCA) and common modes analysis (CMA) are evolving engineering techniques for complex technical systems, used to determine whether common root causes in hardware, software, or highly integrated systems interaction may contribute to human error or improper operation of a system. Systems are analyzed for root causes and causal factors to determine the probability of failure modes, fault modes, or common-mode software faults due to escaped requirements. Ensuring complete testing and verification is also used to confirm that complex systems are designed with no common causes that produce severe hazards.
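Of the techniques listed above, Pareto analysis is the most directly computational: rank cause categories by frequency and attack the "vital few" that account for roughly 80% of the failures. A minimal sketch in Python, with made-up defect categories and counts (illustrative only, not from the source):

```python
# Hypothetical defect counts per cause category (illustrative data only).
defects = {"solder bridge": 212, "missing part": 97, "scratch": 45,
           "wrong label": 31, "misalignment": 9, "other": 6}

# Rank categories by frequency, largest first, as a Pareto chart would.
ranked = sorted(defects.items(), key=lambda kv: kv[1], reverse=True)
total = sum(defects.values())

# Accumulate until the "vital few" categories explain 80% of all defects.
vital_few, running = [], 0
for category, count in ranked:
    vital_few.append(category)
    running += count
    if running / total >= 0.8:
        break

print(vital_few)  # the few categories worth attacking first
```

Plotting `ranked` as bars with a cumulative-percentage line on a second axis gives the classic Pareto chart.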

Common cause analysis is sometimes required as part of the safety engineering tasks for theme parks, commercial/military aircraft, spacecraft, complex control systems, large electrical utility grids, nuclear power plants, automated industrial controls, and medical devices or other safety-critical systems with complex functionality. A major issue with common cause analysis is that it often depends on previously completed weak, ineffective, or erroneous root cause analyses of individual events.

Basic elements of root cause using the Management Oversight Risk Tree (MORT) approach

• Materials
  • Defective raw material
  • Wrong type for job
  • Lack of raw material
• Man power
  • Inadequate capability
  • Lack of knowledge
  • Lack of skill
  • Stress
  • Improper motivation
• Machine / equipment
  • Incorrect tool selection
  • Poor maintenance or design
  • Poor equipment or tool placement
  • Defective equipment or tool
• Environment
  • Disordered workplace
  • Poor job design and/or layout of work
  • Surfaces poorly maintained
  • Inability to meet physical demands of the task
  • Forces of nature
• Management
  • Lack of management involvement
  • Inattention to task
  • Task hazards not dealt with properly
  • Other (horseplay, inattention, etc.)
  • Stress demands
  • Lack of process
  • Lack of communication
• Methods
  • No or poor procedures
  • Practices are not the same as written procedures
  • Poor communication
• Management system
  • Training or education lacking
  • Poor employee involvement
  • Poor recognition of hazard
  • Previously identified hazards were not eliminated
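Because the MORT classification above is essentially a fixed checklist, it can be carried around as plain data during an investigation. The sketch below is a hypothetical structure (not part of any MORT tool, abbreviated to four categories) that flags categories no finding has touched yet:

```python
# MORT-style cause categories, abbreviated from the list above.
MORT = {
    "Materials": ["Defective raw material", "Wrong type for job",
                  "Lack of raw material"],
    "Man power": ["Inadequate capability", "Lack of knowledge",
                  "Lack of skill", "Stress", "Improper motivation"],
    "Machine / equipment": ["Incorrect tool selection",
                            "Poor maintenance or design",
                            "Poor equipment or tool placement",
                            "Defective equipment or tool"],
    "Methods": ["No or poor procedures",
                "Practices are not the same as written procedures",
                "Poor communication"],
}

# Findings recorded during a made-up investigation, tagged by category.
findings = [("Machine / equipment", "Defective equipment or tool"),
            ("Methods", "Practices are not the same as written procedures")]

# Categories with no finding yet -- a prompt to keep asking questions,
# not evidence that nothing is wrong there.
covered = {category for category, _ in findings}
uncovered = [category for category in MORT if category not in covered]
```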

Techniques

There are two primary techniques used to perform 5 Whys:[2] the fishbone (or Ishikawa) diagram and a tabular format.[3] These tools allow the analysis to be branched in order to provide multiple root causes for remedy.

Criticism

While the 5 Whys is a powerful tool for engineers or technically savvy individuals to help get to the true causes of problems, it has been criticized by Teruyuki Minoura, former managing director of global purchasing for Toyota, as being too basic a tool to analyze root causes to the depth that is needed to ensure that the causes are fixed.[4] Reasons for this criticism include:

• Tendency for investigators to stop at symptoms rather than going on to lower-level root causes.
• Inability to go beyond the investigator's current knowledge – the investigator cannot find causes that they do not already know.
• Lack of support to help the investigator ask the right "why" questions.
• Results are not repeatable – different people using 5 Whys come up with different causes for the same problem.
• Tendency to isolate a single root cause, whereas each question could elicit many different root causes.

These can be significant problems when the method is applied through deduction only. On-the-spot verification of the answer to the current "why" question, before proceeding to the next, is recommended as a good practice to avoid these issues.

References

[1] Taiichi Ohno; foreword by Norman Bodek (1988). Toyota Production System: Beyond Large-Scale Production. Portland, OR: Productivity Press. ISBN 0-915299-14-3.
[2] "An Introduction to 5-why" (http://blog.bulsuk.com/2009/03/5-why-finding-root-causes.html). Retrieved 6 March 2010.
[3] "5-why Analysis using an Excel Spreadsheet Table" (http://blog.bulsuk.com/2009/07/5-why-analysis-using-table.html). Retrieved 25 December 2010.
[4] "The "Thinking" Production System: TPS as a winning strategy for developing people in the global manufacturing environment" (http://www.toyotageorgetown.com/tps.asp). Retrieved 2 February 2011.
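The tabular technique is easy to illustrate in code. The sketch below paraphrases Taiichi Ohno's well-known machine-stoppage example and records, per row, the on-the-spot verification recommended above; the helper deliberately stops at the first unverified answer rather than trusting the rest of the chain:

```python
# A 5-why chain kept as a table of (question, answer, verified) rows.
# The content paraphrases Taiichi Ohno's classic machine-stoppage example;
# 'verified' records the on-the-spot check made before asking the next why.
chain = [
    ("Why did the machine stop?", "An overload blew the fuse.", True),
    ("Why was there an overload?", "The bearing was not sufficiently lubricated.", True),
    ("Why was it not lubricated?", "The lubrication pump was not pumping enough.", True),
    ("Why was it not pumping enough?", "The pump shaft was worn and rattling.", True),
    ("Why was the shaft worn?", "There was no strainer, so metal scrap got in.", True),
]

def root_cause(chain):
    """Return the deepest *verified* answer; stop at the first unverified step."""
    deepest = None
    for _question, answer, verified in chain:
        if not verified:
            break
        deepest = answer
    return deepest

print(root_cause(chain))  # the level at which a countermeasure (fit a strainer) acts
```

Keeping verification in the data makes the criticism above concrete: an unverified row truncates the chain instead of silently propagating a guess.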

Why–because analysis

Why–because analysis (WBA) is a method for accident analysis. It is independent of the application domain and has been used to analyse, among others, aviation, railway, marine, and computer-related accidents and incidents. It is mainly used as an after-the-fact (a posteriori) analysis method. WBA strives to ensure objectivity, falsifiability, and reproducibility of results.

The result of a WBA is a why–because graph (WBG). The WBG depicts causal relations between factors of an accident. It is a directed acyclic graph whose nodes are factors; directed edges denote cause–effect relations between the factors.

WBA in detail

WBA starts with the question of what the accident(s) in question is (are). In most cases this is easy to define. Next comes an iterative process to determine causes. At each node (factor), each contributing cause (related factor) must have been necessary, and the totality of causes must be sufficient: it gives the causes, the whole causes, and nothing but the causes. Nothing is omitted (the listed causes are sufficient), and nothing is superfluous (each cause is necessary). This process can be iterated for the newly found causes, and so on, until a satisfactory result has been achieved.

When causes for the accident have been identified, formal tests are applied to all potential cause–effect relations.

The formal tests

The counterfactual test (CT) – The CT leads back to David Lewis' formal notion of causality and counterfactuals. It asks the following question: "If the cause had not been, could the effect have happened?" The CT proves or disproves that a cause is a necessary causal factor for an effect: only if the effect could not have happened without the cause in question is that cause clearly contributing to the effect.

The causal sufficiency test (CST) – The CST asks the question: "Will an effect always happen if all attributed causes happen?" The CST aims at deciding whether a set of causes is sufficient for an effect to happen. Missing causes can thus be identified.

The WBG is correct only if the CT is positive for all causal relations and the CST is positive for all sets of causes and their effects: each cause must be necessary (CT), and the totality of causes must be sufficient (CST).
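Since a why–because graph is a directed acyclic graph, a natural representation is to map each factor to the causes that passed the counterfactual test. The sketch below is illustrative (the factor names are loosely modelled on the Herald of Free Enterprise capsizing, not taken from an actual WBG) and includes a cycle check, because a graph with a causal loop cannot be a valid WBG:

```python
# Why-because graph for a toy accident: each factor maps to its causes
# (edges point effect -> cause). Factor names are illustrative only.
wbg = {
    "capsize": ["water on car deck"],
    "water on car deck": ["bow doors open", "ship left berth"],
    "bow doors open": ["assistant bosun asleep"],
    "ship left berth": [],
    "assistant bosun asleep": [],
}

def is_acyclic(graph):
    """Depth-first search for a causal cycle; a valid WBG must have none."""
    WHITE, GREY, BLACK = 0, 1, 2          # unvisited / in progress / done
    state = dict.fromkeys(graph, WHITE)
    def visit(node):
        state[node] = GREY
        for cause in graph.get(node, ()):
            if state[cause] == GREY:       # back edge: a causal loop
                return False
            if state[cause] == WHITE and not visit(cause):
                return False
        state[node] = BLACK
        return True
    return all(state[n] != WHITE or visit(n) for n in graph)
```

The CT and CST themselves are judgments about the real world, so they cannot be automated here; the code only enforces the structural constraint that every factor's causes form a loop-free graph.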

Example

Partial why–because graph of the capsizing of the Herald of Free Enterprise (figure omitted).

External links

• Why-Because Analysis [1] (WBA)

References

[1] http://www.rvs.uni-bielefeld.de/research/WBA/

Eight Disciplines Problem Solving

Eight Disciplines Problem Solving is a method used to approach and to resolve problems, typically employed by quality engineers or other professionals.[1] [2] 8D has become a standard in the Auto, Assembly, and other industries that require a thorough, structured problem-solving process using a team approach.

The disciplines:

D0: The Planning Phase: Plan for solving the problem and determine the prerequisites.
D1: Use a Team: Establish a team of people with product/process knowledge.
D2: Define and Describe the Problem: Specify the problem by identifying in quantifiable terms the who, what, where, when, why, how, and how many (5W2H) for the problem.
D3: Develop an Interim Containment Plan; Implement and Verify Interim Actions: Define and implement containment actions to isolate the problem from any customer.
D4: Determine, Identify, and Verify Root Causes and Escape Points: Identify all applicable causes that could explain why the problem has occurred, and also identify why the problem was not noticed at the time it occurred. All causes shall be verified or proved, not determined by fuzzy brainstorming. One can use 5 Whys or an Ishikawa diagram to map causes against the effect/problem identified.
D5: Choose and Verify Permanent Corrections (PCs) for the Problem/Nonconformity: Through pre-production programs, quantitatively confirm that the selected correction will resolve the problem for the customer.
D6: Implement and Validate PCAs: Define and implement the best corrective actions.
D7: Prevent Recurrence/Corrective Actions: Modify the management systems, operation systems, practices, and procedures to prevent recurrence of this and all similar problems.
D8: Congratulate Your Team: Recognize the collective efforts of the team. The team needs to be formally thanked by the organization.

History

Ford Motor Company developed a method that was never based on any military standard or other existing problem-solving methodology, while the military also developed and quantified its own process during World War II.

Ford's Perspective

The Team Oriented Problem Solving strategy, based on the use of statistical methods of data analysis, was developed at Ford Motor Company. The executives of the Powertrain Organization (transmissions, chassis, engines) wanted a methodology in which teams (design engineering, manufacturing engineering, and production) could work on recurring problems. In 1986, the assignment was given to develop a manual and a subsequent course that would achieve a new approach to solving tough engineering design and manufacturing problems. The manual for this methodology was documented and defined in "Team Oriented Problem Solving" (TOPS), first published in 1987. The manual and subsequent course material were piloted at World Headquarters in Dearborn, Michigan, and many changes and revisions were made based on feedback from the pilot sessions. The material is extensive, and the 8D titles are merely the chapter headings for each step in the process. This has been Ford's approach to problem solving ever since. Ford also refers to its current variant as G8D (Global 8D).
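Since each discipline is a named step with a concrete deliverable, an 8D report can be modelled as structured data. The sketch below is a hypothetical structure (the field contents are invented, and this is not Ford's G8D schema); the point is only that a report should not be closed while any discipline is still empty:

```python
# The eight disciplines (plus D0) as ordered report sections.
DISCIPLINES = ["D0", "D1", "D2", "D3", "D4", "D5", "D6", "D7", "D8"]

# A made-up, in-progress 8D report keyed by discipline.
report = {
    "D0": "Planned the effort; prerequisites confirmed.",
    "D1": "Team: quality engineer, line supervisor, design engineer.",
    "D2": "5W2H description of the customer leak complaint.",
    "D3": "Containment: 100% inspection at pack-out.",
    "D4": "Root cause verified with a 5-why chain.",
    "D5": "Permanent correction confirmed in pre-production trial.",
    "D6": "Corrective action implemented and validated.",
    "D7": "",   # recurrence prevention not yet documented
    "D8": "",   # team recognition not yet recorded
}

# A report may only be closed when every discipline has content.
missing = [d for d in DISCIPLINES if not report.get(d, "").strip()]
can_close = not missing
```

Treating the disciplines as required fields makes skipped steps visible at a glance, which mirrors how 8D report templates are typically reviewed.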

Military Usage

The US Government first standardized a process during the Second World War as Military Standard 1520, "Corrective Action and Disposition System for Nonconforming Material".[3] This military standard focused on nonconforming material and the disposition of the material. Its 'Determine a Root Cause' step is a part of the military usage of the 8Ds, but it was not a reference in the development of the 8D problem-solving methodology and is not referenced or included in the TOPS manual or course.

Usage

Many disciplines are typically involved in the "8D" process. The tools used, such as the "Is/Is Not" worksheet commonly employed at D2 and the "Fishbone Diagram" or "5 Why Analysis" commonly employed at step D4, can all be found in various textbooks and reference materials used by Quality Assurance professionals.

The 8D problem-solving process is used to identify, correct, and eliminate recurring problems. It focuses on the origin of the problem by determining root causes and establishes a permanent corrective action based on statistical analysis of the problem. The methodology is useful in product and process improvement. Recently, the 8D process has been employed significantly outside the auto industry; as part of Lean initiatives and Continuous Improvement Processes it is employed extensively within the Food Manufacturing, High Tech, and Health Care industries.

In the late 1990s, Ford developed a revised version of the 8D process that it calls "Global 8D" (G8D), which is the current global standard for Ford and many other companies in the automotive supply chain. The major revisions to the process are as follows:

• Addition of a D0 (D-Zero) step as a gateway to the process. At D0, the team documents the symptoms that initiated the effort along with any Emergency Response Actions (ERAs) that were taken before formal initiation of the G8D. D0 also incorporates standard assessing questions meant to determine whether a full G8D is required. The assessing questions are meant to ensure that, in a world of limited problem-solving resources, the efforts required for a full team-based problem-solving effort are limited to those problems that warrant these resources.
• Addition of the Escape Point to D4 through D6. The idea here is to consider not only the Root Cause of a problem but, equally importantly, what went wrong with the control system in allowing the problem to escape. Global 8D requires the team to identify and verify this Escape Point (defined as the earliest control point in the control system following the Root Cause that should have detected the problem but failed to do so) at D4. Then, through D5 and D6, the process requires the team to choose, verify, implement, and validate Permanent Corrective Actions to address the Escape Point.

External links

• Society of Manufacturing Engineers: SME [4]
• The 8 Disciplines (8D) Process [5], PHRED Solutions [6]
• Laurie Rambaud (2006), 8D Structured Problem Solving: A Guide to Creating High Quality 8D Reports, ISBN 0-9790553-0-X
• Eight Disciplines (8D) Problem solving [7]
• Difference between containment, corrective and preventive actions in 8D Report [8]

References

[1] http://www.12manage.com/methods_ford_eight_disciplines_8D.html
[2] http://www.siliconfareast.com/8D.htm
[3] http://www.everyspec.com/MIL-STD/MIL-STD+(1500+-+1599)/MIL_STD_1520C_NOT_1_1407
[4] http://www.sme.org/cgi-bin/get-newsletter.pl?LEAN&20030116&1&
[5] http://www.phredsolutions.com/8D.html
[6] http://www.phredsolutions.com
[7] http://elsmar.com/8D/
[8] http://www.8dreport.com/articles/difference-between-containment-corrective-and-preventive-actions-in-8d-report/