W. Edwards Deming
Joseph Juran
Philip Crosby
Malcolm Baldrige
W. Edwards Deming
Born in 1900, W. Edwards Deming was an American engineer, professor, statistician, lecturer, author, and management consultant.
Deming held that by embracing certain principles of management, organizations can improve the quality of their products and concurrently reduce costs. Cost reduction would come from less waste and rework, lower staff attrition and litigation, and, at the same time, greater customer loyalty. The key, in Deming's opinion, was to practice constant improvement and to imagine the manufacturing process as a seamless whole, rather than as a system made up of incongruent parts.
In the 1970s, some of Deming's Japanese proponents summarized his philosophy in a two-part comparison:
When organizations focus primarily on quality, defined by the relation Quality = Results of work efforts / Total costs, quality improves and costs fall over time.
When organizations focus primarily on costs, costs rise and quality declines over time.
The Deming Cycle, also known as the Shewhart Cycle and often called PDCA, grew out of the need to link the manufacture of products with the needs of the consumer, while focusing departmental resources, in a collegial effort, on meeting those needs. Its four stages are listed below, followed by a small illustrative sketch.
Plan: Design a consumer research methodology which will inform business process components.
Do: Implement the plan to measure its performance.
Check: Check the measurements and report the findings to the decision makers.
Act/Adjust: Draw a conclusion on the changes that need to be made and implement them.
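As a rough illustration, PDCA can be sketched as an iterative improvement loop. The stage functions, the defect-rate numbers, and the stopping rule below are all hypothetical; this is only a skeleton of the idea, not a prescribed implementation.

def plan():
    """Plan: design the change/experiment and predict its effect."""
    return {"change": "reduce oven temperature", "predicted_defect_rate": 0.02}

def do(plan_spec):
    """Do: carry out the plan on a small scale and measure performance."""
    observed_defect_rate = 0.025  # stand-in for a real measurement
    return observed_defect_rate

def check(plan_spec, observed):
    """Check: compare the measurement against the prediction."""
    return observed <= plan_spec["predicted_defect_rate"]

def act(met_prediction):
    """Act/Adjust: standardize the change, or adjust and plan again."""
    return "standardize" if met_prediction else "adjust and re-plan"

for cycle in range(3):  # PDCA repeats; three turns shown here
    spec = plan()
    observed = do(spec)
    decision = act(check(spec, observed))
    print(f"Cycle {cycle + 1}: observed={observed}, decision={decision}")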
The 14 Points for Management
Deming's other chief contribution came in the form of his 14 Points for Management, a set of guidelines for managers looking to transform business effectiveness.
The 7 Deadly Diseases of Management, also defined by Deming, are the most serious barriers that managements face in attempting to increase effectiveness and institute continual improvement.
Joseph Juran
Born in 1904, Joseph Juran was a Romanian-born American engineer and management consultant of the 20th century, and a missionary for quality and quality management. Like Deming's, Juran's philosophy also took root in Japan. He stressed the importance of a broad, organizational-level approach to quality, stating that total quality management begins at the highest position in management and continues all the way to the bottom.
In 1941, Juran was introduced to the work of Vilfredo Pareto. He studied the Pareto principle (the 80/20 rule), which states that, for many events, roughly 80% of the effects follow from 20% of the causes, and applied the concept to quality issues. Thus, according to Juran, 80% of the problems in an organization are caused by 20% of the causes. This is also known as the rule of the "vital few and the trivial many". In his later years, Juran preferred "the vital few and the useful many", suggesting that the remaining 80% of the causes must not be completely ignored.
Pareto analysis is a formal technique useful where many possible courses of action are competing for attention.
In essence, the problem-solver estimates the benefit delivered by each action, then selects a number of the most
effective actions that deliver a total benefit reasonably close to the maximal possible one.
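As a minimal sketch of this selection step, the following Python snippet ranks hypothetical causes by their estimated contribution and keeps the "vital few" that cover roughly 80% of the total effect. The cause names and counts are invented for illustration.

causes = {
    "incorrect setup": 45,
    "worn tooling": 30,
    "operator error": 12,
    "material defects": 7,
    "calibration drift": 4,
    "other": 2,
}

total = sum(causes.values())
cumulative = 0.0
vital_few = []

# Rank causes by contribution and keep those covering ~80% of the effect.
for cause, count in sorted(causes.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count / total
    vital_few.append((cause, count, round(cumulative * 100, 1)))
    if cumulative >= 0.80:
        break

for cause, count, cum_pct in vital_few:
    print(f"{cause:<18} {count:>3}  cumulative {cum_pct}%")

With these invented numbers, the first two causes already account for 75% of the effect, and the third takes the cumulative total past the 80% threshold, so only three of the six causes qualify as the vital few.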
Pareto analysis is a creative way of looking at causes of problems because it helps stimulate thinking and organize thoughts. However, it can be limited by its exclusion of possibly important problems which may be small initially but grow with time. It should therefore be combined with other analytical tools, such as failure mode and effects analysis (FMEA) and fault tree analysis.
This technique helps to identify the top portion of causes that need to be addressed in order to resolve the majority of problems. Once the predominant causes are identified, tools like the Ishikawa (fishbone) diagram can be used to identify the root causes of the problems. While it is common to refer to the Pareto principle as the "80/20 rule", under the assumption that in all situations 20% of causes determine 80% of problems, this ratio is merely a convenient rule of thumb; it neither is, nor should be treated as, an immutable law of nature.
The application of the Pareto analysis in risk management allows management to focus on those risks that have
the most impact on the project.
Juran's approach to quality management drew practitioners outside the walls of the factory and into the non-manufacturing processes of the organization, especially those that were service related.
The Juran Quality Trilogy
One of the first to write about the cost of poor quality, Juran developed an approach to cross-functional management that comprises three managerial processes:
Quality Planning: This process involves creating awareness of the need to improve, setting goals, and planning ways to reach those goals. It has its roots in management's commitment to planned change, which requires trained and qualified staff.
Quality Control: This process develops the methods used to test products for their quality. Deviation from the standard will require change and improvement.
Quality Improvement: This process involves the constant drive toward perfection. Quality improvements need to be introduced continuously. Problems must be diagnosed down to their root causes in order to develop solutions. Management must analyze the processes and systems and report back with recognition and praise when things are done right.
Juran Trilogy
1. Accomplish improvements that are structured on a regular basis, with commitment and a sense of urgency.
2. Build an extensive training program.
3. Cultivate commitment and leadership at the higher echelons of management.
Juran also set out ten steps to quality improvement:
1. Establish awareness of the need to improve and of the opportunities for improvement.
2. Set goals for improvement.
3. Organize to meet the goals that have been set.
4. Provide training.
5. Implement projects aimed at solving problems.
6. Report progress.
7. Give recognition.
8. Communicate results.
9. Keep score.
10. Maintain momentum by building improvement into the company's regular systems.
Philip Crosby
Born in 1926, Philip Crosby was an author and businessman who contributed to management theory and quality management practices. He started his career in quality much later than Deming and Juran. He founded Philip Crosby Associates, an international consulting firm on quality improvement.
Crosby's principle, Doing It Right the First Time, was his answer to the quality crisis. He defined quality as full
and perfect conformance to the customers' requirements. The essence of his philosophy is expressed in what
he called the Absolutes of Quality Management and the Basic Elements of Improvement.
Zero Defects
Crosby's Zero Defects is a performance method and standard which states that people should commit themselves to closely monitoring details and avoiding errors. By doing this, they move closer to the zero defects goal.
According to Crosby, zero defects was not just a manufacturing principle but an all-pervading philosophy that ought to influence every decision we make. It reinforces the managerial notions that defects are unacceptable and that everyone should do things right the first time. Crosby laid out 14 steps for quality improvement:
1. Make it clear that management is committed to quality for the long term.
2. Form cross-departmental quality teams.
3. Identify where current and potential problems exist.
4. Assess the cost of quality and explain how it is used as a management tool.
5. Improve the quality awareness and personal commitment of all employees.
6. Take immediate action to correct problems identified.
7. Establish a zero-defect program.
8. Train supervisors to carry out their responsibilities in the quality program.
9. Hold a Zero Defects Day to ensure all employees are aware there is a new direction.
10. Encourage individuals and teams to establish both personal and team improvement goals.
11. Encourage employees to tell management about obstacles they face in trying to meet quality goals.
12. Recognize employees who participate.
13. Establish quality councils to promote continual communication.
14. Repeat everything to illustrate that quality improvement is a never-ending process.
The Quality Vaccine
Crosby described this vaccine as the preventive medicine organizations need against poor quality. Its components include:
Integrity: Quality must be taken seriously throughout the entire organization, from the highest levels to the
lowest. The company's future will be judged by the quality it delivers.
Systems: The right measures and systems are necessary for quality costs, performance, education,
improvement, review, and customer satisfaction.
Operations: A culture of improvement should be the norm in any organization, and the process should be solid.
Policies: Policies that are implemented should be consistent and clear throughout the organization.
Malcolm Baldrige
Who was he?
Baldrige was born on October 4, 1922, in Omaha, Nebraska. He attended The Hotchkiss School and Yale University, where he was a member of Delta Kappa Epsilon.
Baldrige began his career in manufacturing in 1947 as a foundry hand at an iron company in Connecticut and rose to the presidency of that company by 1960. During World War II, he served in combat in the Pacific as a captain in the 27th Infantry Division. Prior to entering the Cabinet as U.S. Secretary of Commerce, Baldrige was chairman and chief executive officer of Scovill, Inc., a Waterbury, Connecticut-based brass company. Having joined Scovill in 1962, he is credited with leading its transformation from a financially troubled brass mill into a highly diversified manufacturer of consumer, housing, and industrial goods.
The Baldrige Excellence Framework and its criteria serve two purposes:
1. To help organizations assess their improvement efforts, diagnose their overall performance management system, and identify their strengths and opportunities for improvement; and
2. To identify Baldrige Award recipients that will serve as role models for other organizations.
The criteria for performance excellence are based on a set of core values:
1. Systems Perspective
2. Visionary Leadership
3. Customer-Focused Excellence
4. Valuing People
5. Organizational Learning and Agility
6. Focus on Success
7. Managing for Innovation
8. Management by Fact
9. Societal Responsibility
10. Ethics and Transparency
11. Delivering Value and Results
The questions that make up the criteria represent seven aspects of organizational management and
performance:
1. Leadership
2. Strategy
3. Customers
4. Measurement, Analysis, And Knowledge Management
5. Workforce
6. Operations
7. Results
The three sector-specific versions of the Baldrige framework (business/nonprofit, health care, and education) are revised every two years.
Fishbone (Ishikawa) Diagram
Variations: cause enumeration diagram, process fishbone, time-delay fishbone, CEDAC (cause-and-effect diagram with the addition of cards), desired-result fishbone, reverse fishbone diagram
The fishbone diagram identifies many possible causes for an effect or problem. It can be used to structure a
brainstorming session. It immediately sorts ideas into useful categories.
Fishbone Diagram Example
This fishbone diagram was drawn by a manufacturing team to try to understand the source of periodic iron
contamination. The team used the six generic headings to prompt ideas. Layers of branches show thorough
thinking about the causes of the problem.
For example, under the heading “Machines,” the idea “materials of construction” shows four kinds of equipment
and then several specific machine numbers.
Note that some ideas appear in two different places. “Calibration” shows up under “Methods” as a factor in the
analytical procedure, and also under “Measurement” as a cause of lab error. “Iron tools” can be considered a
“Methods” problem when taking samples or a “Manpower” problem with maintenance personnel.
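The same content can be captured as a simple data structure while brainstorming, which makes it easy to review and reorganize branches later. The sketch below loosely mirrors the iron-contamination example above; apart from the causes quoted in the text, the entries are hypothetical placeholders.

# A minimal sketch of recording fishbone-diagram content during a
# brainstorming session. Categories follow the generic headings; most
# causes below are hypothetical examples.

fishbone = {
    "effect": "periodic iron contamination",
    "causes": {
        "Machines": ["materials of construction", "specific machine numbers"],
        "Methods": ["analytical procedure calibration", "sampling with iron tools"],
        "Materials": ["raw material supplier change"],
        "Measurement": ["lab calibration error"],
        "Manpower": ["maintenance practices"],
        "Environment": ["rust in storage area"],
    },
}

print(f"Effect: {fishbone['effect']}")
for category, causes in fishbone["causes"].items():
    print(f"  {category}:")
    for cause in causes:
        print(f"    - {cause}")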
Seven Basic Tools of Quality
The Seven Basic Tools of Quality is a designation given to a fixed set of graphical techniques identified as being
most helpful in troubleshooting issues related to quality. They are called basic because they are suitable for
people with little formal training in statistics and because they can be used to solve the vast majority of quality-
related issues.
The designation arose in postwar Japan, inspired by the seven famous weapons of Benkei. It was possibly introduced by Kaoru Ishikawa, who in turn was influenced by a series of lectures W. Edwards Deming had given to Japanese engineers and scientists in 1950. At that time, companies that had set about training their workforces in statistical quality control found that the complexity of the subject intimidated the vast majority of their workers, and so they scaled back training to focus primarily on simpler methods that suffice for most quality-related issues.
The Seven Basic Tools stand in contrast to more advanced statistical methods such as survey sampling,
acceptance sampling, statistical hypothesis testing, design of experiments, multivariate analysis, and various
methods developed in the field of operations research.
Examples
1. Cause-and-effect diagram
2. Check sheet
Six Sigma
Six Sigma is a disciplined, statistics-based, data-driven approach and continuous improvement methodology for eliminating defects in a product, process, or service. It was developed by Motorola in the early-to-mid 1980s, building on quality management fundamentals, and then became a popular management approach at General Electric (GE) in the early 1990s. Hundreds of companies around the world have since adopted Six Sigma as a way of doing business.
Sigma represents the population standard deviation, a measure of the variation in a data set collected about the process. If a defect is defined by specification limits separating good from bad outcomes of a process, then a six sigma process has a process mean (average) that is six standard deviations from the nearest specification limit. This provides a large buffer between the process's natural variation and the specification limits.
For example, if a product must have a thickness between 10.32 and 10.38 cm to meet customer requirements, then the process mean should be around 10.35, with a standard deviation of no more than 0.005 (at a standard deviation of 0.005, the limit 10.38 sits six standard deviations from the mean of 10.35).
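A minimal sketch of this arithmetic, assuming normally distributed measurements, is shown below; the function name and the example values simply restate the thickness example above.

def sigma_distance(mean: float, std: float, lsl: float, usl: float) -> float:
    """Distance from the mean to the nearest spec limit, in std units."""
    return min(usl - mean, mean - lsl) / std

# Values from the worked example above.
print(sigma_distance(mean=10.35, std=0.005, lsl=10.32, usl=10.38))  # ~6.0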
Six Sigma can also be thought of as a measure of process performance, with six sigma as the goal, based on defects per million. Once the current performance of the process is measured, the goal is to continually improve the sigma level, striving toward six sigma. Even if the improvements never reach six sigma, moving from 3 sigma to 4 sigma to 5 sigma will still reduce costs and increase customer satisfaction.
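The mapping from an observed defect rate to a sigma level can be sketched as follows. Note that the 1.5-sigma long-term shift used here is a common industry convention rather than something stated above.

from statistics import NormalDist

def sigma_level(defects_per_million: float, shift: float = 1.5) -> float:
    """Short-term sigma level corresponding to an observed defect rate."""
    yield_fraction = 1 - defects_per_million / 1_000_000
    return NormalDist().inv_cdf(yield_fraction) + shift

print(round(sigma_level(3.4), 2))      # ~6.0  (3.4 DPMO ~ "six sigma")
print(round(sigma_level(66_807), 2))   # ~3.0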
Statistical Process Control (SPC)
The charts for this section covered the following topics: Cm (machine capability), Cp (process capability), the factors generally regarded as causing variation in capability measurements, machine capability versus process capability, standard deviation, target value, the normal distribution curve, Six Sigma, control limits, and how control limits are determined.
Cp, Cpk, Pp and Ppk: Know How and When to Use Them
For many years industries have used Cp, Cpk, Pp and Ppk as statistical measures of process quality capability. Some segments of manufacturing have specified minimal requirements for these parameters, even in some of their key documents, such as advanced product quality planning and ISO/TS 16949. Six Sigma, however, suggests a different evaluation of process capability: measuring against a sigma level, also known as sigma capability.
Incorporating metrics that differ from traditional ones may lead some companies to wonder about the necessity and adoption of these metrics. It is important to emphasize that traditional capability studies and sigma capability measures serve a similar purpose. Once the process is under statistical control and showing only common causes of variation, it is predictable. This is when it becomes meaningful for companies to predict the current process's probability of meeting customer specifications or requirements.
Capability Studies
Traditional capability rates are calculated when a product or service feature is measured through a quantitative continuous variable, assuming the data follow a normal probability distribution. A normal distribution is characterized by a mean and a standard deviation, which make it possible to estimate the probability of an observation falling within any region of the data set.
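A minimal sketch of such an estimate, assuming normality: the numbers below anticipate the worked example later in this section (mean 155.74 mm, specs 155-157 mm) with an assumed overall standard deviation of about 0.44 mm.

from statistics import NormalDist

def out_of_spec_probability(mean, std, lsl, usl):
    """P(X < LSL) + P(X > USL) for X ~ Normal(mean, std)."""
    dist = NormalDist(mean, std)
    return dist.cdf(lsl) + (1 - dist.cdf(usl))

# Hypothetical process: mean 155.74 mm, sd 0.44 mm, specs 155-157 mm.
print(out_of_spec_probability(155.74, 0.44, 155.0, 157.0))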
The most interesting values relate to the probability of data occurring outside of customer specifications, i.e., data appearing below the lower specification limit (LSL) or above the upper specification limit (USL). A common mistake is to apply capability studies to categorical data by turning the data into rates or percentiles. In such cases, determining specification limits becomes complex. For example, a billing process may generate correct or incorrect invoices. These are categorical variables, which by definition carry an ideal USL of 100 percent error-free processing, rendering the traditional statistical measures (Cp, Cpk, Pp and Ppk) inapplicable to categorical variables.
When working with continuous variables, the traditional statistical measures are quite useful, especially in manufacturing. The difference between the capability rates (Cp and Cpk) and the performance rates (Pp and Ppk) is the method of estimating the population standard deviation. The difference between the centralized rates (Cp and Pp) and the unilateral rates (Cpk and Ppk) is the impact of mean decentralization on the process performance estimates.
The following example details the impact that the different ways of calculating capability may have on the results of a process study. A company manufactures a product whose acceptable dimensions, previously specified by the customer, range from 155 mm to 157 mm. The first 10 parts made each day by a machine that manufactures the product and operates during a single shift were collected as samples over a period of 28 days. Measurement data from these parts was used to make an Xbar-S control chart (Figure 1).
Figure 1: Xbar-S Control Chart of Evaluation Data
This chart presents only common cause variation and, as such, leads to the conclusion that the process is predictable. Calculating process capability yields the results shown in Figure 2.
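Since the raw measurements are not reproduced here, the following hedged sketch uses randomly generated stand-in data with the same structure (28 daily subgroups of 10 parts) to show the Xbar-S control-limit arithmetic; the A3, B3, and B4 constants for subgroups of size 10 come from standard SPC tables.

import random

random.seed(7)
# Hypothetical stand-in data: 28 daily subgroups of 10 measurements (mm).
subgroups = [[random.gauss(155.74, 0.41) for _ in range(10)] for _ in range(28)]

def mean(xs):
    return sum(xs) / len(xs)

def stdev(xs):
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5

xbar = [mean(g) for g in subgroups]   # subgroup means
s = [stdev(g) for g in subgroups]     # subgroup standard deviations
xbarbar, sbar = mean(xbar), mean(s)

A3, B3, B4 = 0.975, 0.284, 1.716  # control-chart constants for n = 10

print(f"Xbar chart: CL={xbarbar:.3f}  UCL={xbarbar + A3 * sbar:.3f}  LCL={xbarbar - A3 * sbar:.3f}")
print(f"S chart:    CL={sbar:.3f}  UCL={B4 * sbar:.3f}  LCL={B3 * sbar:.3f}")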
Calculating Cp
The Cp capability rate is calculated from the formula:
Cp = (USL − LSL) / (6s)
where s represents the standard deviation of the population, estimated as s = s-bar / c4, with s-bar representing the mean of the standard deviations of the rational subgroups and c4 representing a statistical correction coefficient.
In this case, the formula weighs the quantity of variation given by the standard deviation against the acceptable gap allowed by the specification limits, regardless of the mean. The population standard deviation, estimated from the mean of the standard deviations within the subgroups, is 0.413258, which generates a Cp of 0.81.
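The calculation can be sketched as follows; the c4 value for subgroups of size 10 is taken from standard SPC tables, and the s-bar value is back-calculated from the sigma estimate quoted above.

C4_N10 = 0.9727  # correction coefficient for subgroups of 10

def cp(usl, lsl, s_bar, c4=C4_N10):
    """Cp = (USL - LSL) / (6 * sigma), sigma estimated as s_bar / c4."""
    sigma = s_bar / c4
    return (usl - lsl) / (6 * sigma)

# The text reports sigma = 0.413258 directly, so s_bar = sigma * c4.
print(round(cp(157.0, 155.0, 0.413258 * C4_N10), 2))  # 0.81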
Rational Subgroups
A rational subgroup is a concept developed by Shewhart while he was defining control charts. It consists of a sample in which the differences in the data within a subgroup are minimized and the differences between subgroups are maximized. This allows a clearer identification of how the process parameters change along a time continuum. In the example above, the way the samples were collected allows each daily collection to be treated as a distinct rational subgroup.
Calculating Cpk
The Cpk rate is calculated as:
Cpk = min[ (USL − x-bar) / (3s), (x-bar − LSL) / (3s) ]
where x-bar is the process mean and s is the same estimate used for Cp.
In this case, besides the quantity of variation, the process mean also affects the indicator. Because the process is not perfectly centered, the mean is closer to one of the limits and, as a consequence, the process has a higher chance of missing its capability targets. In the example above, the specification limits are 155 mm and 157 mm. The mean (155.74) is closer to one of them than to the other, leading to a Cpk (0.60) that is lower than the Cp (0.81). This implies that the LSL is more difficult to meet than the USL. Non-conformities exist at both ends of the histogram.
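A minimal sketch of the Cpk arithmetic, reusing the sigma estimate from the Cp section:

def cpk(usl, lsl, mean, sigma):
    """Cpk = min[(USL - mean), (mean - LSL)] / (3 * sigma)."""
    return min(usl - mean, mean - lsl) / (3 * sigma)

print(round(cpk(157.0, 155.0, 155.74, 0.413258), 2))  # ~0.60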
Estimating Pp
The difference between Cp and Pp lies in the method of calculating s, and in whether or not the existence of rational subgroups is considered. For Pp, the rational subgroups are ignored and s is the standard deviation of all the individual measurements:
Pp = (USL − LSL) / (6s)
Estimating Ppk
Calculating Ppk presents similarities with the calculation of Cpk. The Ppk rate is calculated using the formula:
Ppk = min[ (USL − x-bar) / (3s), (x-bar − LSL) / (3s) ]
with s again taken as the overall standard deviation.
Once more it becomes clear that this estimate is able to diagnose decentralization problems in addition to the quantity of process variation. Following the tendency detected in Cpk, notice that the Pp value (0.76) is higher than the Ppk value (0.56), because the rate of discordance with the LSL is higher. Because the calculation of the standard deviation is not tied to rational subgroups, the standard deviation is higher, resulting in a Ppk (0.56) lower than the Cpk (0.60), which reveals a more pessimistic performance projection.
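A sketch of both rates follows. Because the raw data is not reproduced here, the overall standard deviation is back-calculated (about 0.4386) so that the reported Pp (0.76) and Ppk (0.56) are reproduced; with real data it would be computed from all individual measurements.

def pp(usl, lsl, s_overall):
    return (usl - lsl) / (6 * s_overall)

def ppk(usl, lsl, mean, s_overall):
    return min(usl - mean, mean - lsl) / (3 * s_overall)

s_overall = 0.4386  # back-calculated assumption, not from raw data
print(round(pp(157.0, 155.0, s_overall), 2))           # 0.76
print(round(ppk(157.0, 155.0, 155.74, s_overall), 2))  # 0.56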
Calculating Sigma Capability
In the example above, it is possible to observe the incidence of faults relative to both the upper and the lower specification limits. Although faults relative to the LSL have a greater chance of happening, problems relative to the USL will continue to occur. Cpk and Ppk do not capture this, because those rates are always calculated from the more critical side of the distribution alone.
In order to calculate the sigma level of this process it is necessary to estimate Z bench. This converts the data distribution into a standard normal distribution while adding together the probabilities of failure above the USL and below the LSL. The calculation is as follows:
Z USL = (USL − x-bar) / s and Z LSL = (x-bar − LSL) / s
P(defect) = P(Z > Z USL) + P(Z > Z LSL)
Z bench = the value of Z whose upper-tail probability equals P(defect) (Figure 3)
Figure 3: Distribution Z
The sigma level is then obtained by adding the conventional 1.5-sigma shift to Z bench:
Sigma level = Z bench + 1.5
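A minimal sketch of the whole Z bench and sigma-level calculation, using the example's mean and specification limits together with the assumed overall standard deviation from the previous section:

from statistics import NormalDist

std_normal = NormalDist()

def z_bench(mean, s, lsl, usl):
    p_low = 1 - std_normal.cdf((mean - lsl) / s)   # failures below LSL
    p_high = 1 - std_normal.cdf((usl - mean) / s)  # failures above USL
    return std_normal.inv_cdf(1 - (p_low + p_high))

zb = z_bench(155.74, 0.4386, 155.0, 157.0)
print(f"Z bench ~ {zb:.2f}, sigma level ~ {zb + 1.5:.2f}")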