
[Cover illustration: an output-flow monitoring dashboard showing pieces produced, longest nonstop run, number of stops, average stop length, value-adding time, downtime analysis, OEE and batch performance tracking, with a to-do list reading "Industry 4.0, IoT, rich data."]

JUST THE FACTS


The capabilities of the Internet of Things (IoT) and Industry 4.0 combine to transform
today’s quality paradigm into data-driven “deep quality” that is better able to
address latent and overt stakeholder needs. Deep quality is key to capturing value
from mass customization and building products that address real-world needs.
The authors examine how IoT supports capturing authentic consumer and real-world
use-case data to develop responsive and adaptive products. They also address how
Industry 4.0 supports small-batch manufacturing to ensure that efficient, effective
products make it into customers’ hands.

Technical affordances unlock capabilities across domains. Often, advances in one field spur
broader transformation in another. One discipline undergoing radical technical innovation is the
internet, with improvements in sensing, connectivity, inference and actuation enabling smart,
scalable and hyperconnected systems in a revolution called the Internet of Things (IoT). IoT’s
affordances support other disciplines’ reinvention.

Industry, for example, benefits from IoT’s adoption, and from parallel advances in artificial
intelligence (AI) and big data. While craftspeople once fabricated bespoke widgets and, more
recently, mass production commoditized manufactured goods, the capabilities of IoT-related
technologies uniquely enable an ongoing fourth industrial revolution. Industry 4.0 pairs smart
supply chains, demand-responsive, distributed and flexible manufacturing, and a return to lot-
size-one fabrication with modern production economics. The result is increased production
diversity, reduced costs and enhanced quality.

But is quality truly improving? For decades, quality has emphasized conformance, defect
minimization, perception enhancement and cost-saving reliability improvements. More recently,
organizations sought to build upon traditional quality through voice of the customer (VOC)
exercises and sentiment analysis. But these efforts resulted in limited success due to reporting
biases, subjectivity and knowledge gaps. Corporations made their products right, but they failed
to make the right products.

This article explores the history of quality and envisions how the capabilities of IoT and Industry
4.0 combine to transform today’s quality paradigm into data-driven “deep quality” that can better
address latent and overt stakeholder needs. Specifically, we examine how IoT supports capturing
authentic consumer and real-world use-case data to facilitate the development of responsive
and adaptive products. We also consider how Industry 4.0 supports small-batch manufacturing to
ensure that performant and effective products make it into customers’ hands.

A history of quality management


Historically, quality has been associated with the “sensation of (good) properties.” It embodies
two aspects of excellence: objective quality, such as the sharpness of a blade, and perceived
quality, such as the reputation of the sword’s craftsperson. Prior to industrialization, quality was
costly and available only to elites. Upon the democratization of manufactured goods, quality no
longer reflected absolute excellence but rather relative excellence and perceived value—
functionality, features and satisfaction with respect to the price paid.

Early quality management for mass production acknowledged the importance and salability of
relative excellence. Manufacturers worked to improve tolerances (for component interchange),
finishes (to enhance look and feel) and material uniformity (for performance consistency), with
Walter Shewhart, W. Edwards Deming and Joseph Juran pioneering material, part and product
inspection in the early 20th century. By the 1940s, plan-do-check-act and statistical process
control enabled process monitoring and control to discover, correct and eliminate flaws.
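Statistical process control of this era can be illustrated with a Shewhart X-bar chart, which flags sampled subgroups whose means stray beyond 3-sigma control limits. The following is a minimal sketch; the function names and sample data are our own, not drawn from any particular SPC tool:

```python
import statistics

def xbar_control_limits(subgroup_means, sigma_within, n):
    """Compute 3-sigma control limits for an X-bar chart.

    subgroup_means: mean measurement of each sampled subgroup
    sigma_within: estimated within-subgroup standard deviation
    n: number of units per subgroup
    """
    center = statistics.mean(subgroup_means)
    margin = 3 * sigma_within / n ** 0.5
    return center - margin, center, center + margin

def out_of_control(subgroup_means, lcl, ucl):
    """Return indices of subgroups whose means fall outside the limits."""
    return [i for i, m in enumerate(subgroup_means) if not lcl <= m <= ucl]

# Hypothetical shaft diameters (mm), sampled in subgroups of five
means = [10.00, 10.01, 9.99, 10.02, 10.20, 10.00]
lcl, center, ucl = xbar_control_limits(means, sigma_within=0.05, n=5)
print(out_of_control(means, lcl, ucl))  # [4] -- the fifth subgroup signals a shift
```

A point outside the limits prompts the operator to search for an assignable cause before the process drifts into producing scrap, which is precisely the discover-correct-eliminate loop described above.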

The 1960s brought about proactive, design-phase quality assurance with failure mode and
effects analysis. More recently, manufacturing line and equipment sensors and numerical
controllers began closed-loop process supervision. Simultaneously, the Toyota Production
System reduced material waste and accelerated process changeover, while total quality control
and poka-yoke empowered employees to discover and resolve problems, as well as error-proof
processes. The result was improved product specification conformance and defect reduction. In
the 1980s, quality management became a holistic organizational effort in recognizing the
importance of human capabilities and employee engagement, leading to the concept of total
quality management (TQM).

As traditional quality attributes improved, a latent problem emerged: mass production removed
interaction between customers and craftspeople, meaning products often were designed as a
speculative one-size-fits-all solution stemming from data-blind processes. Worsening this issue
was the growing separation between white-collar engineers and blue-collar assemblers, which
limited interaction and siloed knowledge.

With high-conformance product sales slowing, organizations began to solicit consumer input
through observational studies or focus groups, with initiatives such as quality function
deployment. In practice, however, consumers rarely know what they want (they want what they
know)—and importantly, they hold reporting biases, use imprecise language, misremember facts
and attempt to provide answers that interviewers want to hear. The resulting products frequently
failed to meet real-world needs.

Today, commoditized conformance quality fails to differentiate products. Instead, modern quality
has two critical attributes: needs-meeting, which determines the features a product must have to
satisfy consumers, and traditional reliability and defect control, which lowers costs and, along
with needs-meeting, contributes to improving brand perception and reputation.

With the proliferation of IoT and Industry 4.0, consumer devices generate copious data and
manufacturers may produce bespoke, data-informed and adaptive products addressing both
facets of modern quality. Leveraging these capabilities to simultaneously make the right product
and make the product right is the foundation of a revolution in TQM we call “deep quality.”

MORE THAN TECHNOLOGY


Quality 4.0 is more than technology. It’s a new way for quality professionals to drive quality with
digital tools and understand how to apply them within the Fourth Industrial Revolution to promote
best practices and thought leadership in their organizations. By speaking the digital language
and making the case for quality—especially in times of disruption—organizations can pivot and
adapt to rapidly shifting circumstances on their journey to achieve excellence.

Last year, Forbes Insights, in partnership with ASQExcellence (ASQE) and ASQ, examined how
quality initiatives are progressing in the digital era based on the views and experiences of 1,036
executives and quality professionals from global enterprises. The results from ASQE’s Insights
on Excellence (IoE) benchmarking data allow ASQE to bring real-world metrics and quality
insights to ASQE and ASQ members.

To deliver actionable guidance to our member communities to pursue best practices in
organizational excellence and operations, the 2020 IoE Category Report gathered data on the
ways technology is rapidly changing the workplace, the workforce and the markets that
organizations serve.

Key takeaways from the report reveal that nearly four out of five respondents (79%) agree to
some extent that their organization needs to invest in digital transformation in the next year to
meet evolving customer expectations of quality. Furthermore, while a vast majority (93%) agree
that investment in technology has improved performance against quality objectives, fewer than a
quarter (23%) completely agree that they have seen strong return on investment to date.

By understanding the ways that digital transformation requires not only extensive change
management practices, but also strategy and planning efforts, quality leaders can leverage their
extensive expertise to direct their organizations’ efforts to pursue excellence.

To learn more about the ASQE IoE suite of research and purchase your own copy of the 2020
ASQE IoE Category Report, visit insightsonexcellence.org.

Deep quality is a ‘uniquely now’ revolution


As traditional conformance quality evolves from a delighter to a must-have product attribute,
there is an opportunity for quality to represent more than things being trouble free.

Deep quality uses real-world data in the opportunity identification, design and use phases to
improve product fit and performance. It leverages connectivity to make products responsive to
changing needs and use cases, and it draws on modern manufacturing to make cost-effective,
bespoke solutions. Deep quality also draws on recent technical advances to comprehensively
address needs-meeting and conformance.

To capture rich and unbiased data from potential customers, organizations may instrument
people, places, objects and systems using IoT, pervasive sensing and ubiquitous connectivity.
Analytics build upon advances in AI, elastically scalable platforms, smart application
programming interfaces, digital twins and the cloud to transform information from data exhaust
into actionable insights.
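As a toy illustration of turning raw device telemetry into an actionable signal, a rolling-window outlier check might look like the sketch below. The function, window size and threshold are our own assumptions, not a reference to any particular analytics platform:

```python
from collections import deque

def rolling_anomalies(readings, window=5, threshold=3.0):
    """Return indices of readings that deviate sharply from the trailing window."""
    recent = deque(maxlen=window)
    flagged = []
    for i, value in enumerate(readings):
        if len(recent) == window:
            mean = sum(recent) / window
            std = (sum((r - mean) ** 2 for r in recent) / window) ** 0.5
            # Flag the reading if it sits more than `threshold` standard
            # deviations away from the recent mean
            if std > 0 and abs(value - mean) / std > threshold:
                flagged.append(i)
        recent.append(value)
    return flagged

# Hypothetical temperature trace from a connected appliance (degrees C)
trace = [20.0, 20.1, 19.9, 20.0, 20.1, 25.0, 20.0]
print(rolling_anomalies(trace))  # [5] -- the spike at index 5 is flagged
```

In a deep-quality pipeline, such a flag might trigger a support intervention, feed a digital twin, or inform the next design revision rather than waiting for a warranty claim.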

These advances combine to allow for the creation of a flexible, multi-stakeholder and interactive
platform wherein real customers, product developers, external innovators, solution providers,
suppliers and producers join to co-innovate and create needs-meeting products irrespective of
physical distance, organizational barriers and traditional role silos. The result is a participatory
approach toward data-driven design, economical mass-customization and late differentiation.

By using IoT and parallel advances to observe and capture data from real use patterns, as
opposed to relying on traditional VOC surveys, organizations transition to the more descriptive
and faithful voice of big data, which communicates information from real customers about how they use
products; how those products can be built better, smarter and less expensively; and opportunities
for optimization during use. It also helps to anticipate and mitigate the consequences of likely
failures. Products in the field may report back data to allow true continuous performance
improvements rather than at traditional redesign intervals.

Using Industry 4.0 manufacturing allows for reduced production costs and lot-size-one
manufacturing, so businesses across scales may cost-effectively produce and distribute bespoke
products tailored to individuals’ use cases. Product-generated data will allow manufacturers to
optimize material use, geometries and functions to match customer needs without over- or
under-building. Elastic, distributed smart manufacturing will reduce production lead time, shorten
changeovers, stabilize demand and limit freight distance.

If the resulting products are connected, they may be adapted further to specific use cases at the
time of sale or evolve over time, transforming consumption from single-sale to an ongoing
relationship. Tesla’s in-car software, for example, regularly adds new features and adapts to
individual and aggregate user needs. Making disposable goods malleable through software
improvements grows consumer lifetime value and increases engagement, with cloud
computation allowing product capabilities to be scaled indefinitely.

The affordances of IoT and Industry 4.0 combine to ensure we not only make the product right,
but also that we make the right product—the defining characteristic of deep quality.

Implementing deep quality requires organizational change


To capture value from deep quality, organizations must emphasize leveraging deep
technologies, including IoT and Industry 4.0, to create smart, connected systems capable of
collecting—and responding to—rich data recorded from consumers with consent. Data, products
and processes will—in equal measure—generate long- and short-term value for legacy
businesses and upstarts alike.

Deep quality will have the most significant impact when integral to an organization’s philosophy
and cultivated within a supportive environment. Organizational challenges are at odds with deep
quality’s implementation, however, and these must be addressed to maximize the quality
revolution’s impact.

Traditional business structures are influenced by mass production and therefore implement rigid
managerial and design practices, silo knowledge and invest in inflexible equipment incongruous
with mass differentiation. This manifests visibly in organizations with deeply entrenched,
fragmented business tools, such as siloed product life-cycle management, enterprise resource
planning, quality management and customer relationship management systems. These tools
may fail to interoperate, which limits inter-process data sharing and scalability.

Manufacturers seeking to create diverse needs-meeting products instead require a scalable and
agile organizational framework. In addition to management implementing interoperable business
tools, employees must gain an appreciation for and understanding of data and its analysis.
Traditional quality teams must expand their capabilities to better engage with customers, and
they must regularly interact with design engineers, service representatives, IT, vendors and
others outside traditional quality teams.

These same employees should acquire knowledge extending beyond traditional defect metrics
reflecting assembly error or hardware failures to better address risk management, expectation-
meeting and customer service. Smarter systems require smarter support—both proactively,
during design and manufacturing, and reactively after products are in the field—and reskilling
and training will help quality employees align corporate goals with a vision for improving quality,
driving sales through improved consumer perception.

To support this shift, management should encourage employees to put trust in data with
appropriate provenance—even if it counters long-standing assumptions—and to embrace the
ability to update a product in the field—even if that means disrupting sales by transitioning from a
single point of sale to an ongoing customer relationship.

While deep quality generates data and improves traditional quality metrics, such as defect rate,
risk management and conformance through closed-loop control, these benefits are secondary to
increased sales and improved consumer sentiment resulting from enhanced needs-meeting.
Deep quality will change organizations’ foci from defect reduction toward building products that
meet pressing, real-world needs, and adapting those products as needs evolve. This will
increase customer lifetime value and is a fundamental change in valuing product lines to
consider potential upsides and future scalability, rather than sunk cost and leveraging existing
competencies.

Management also must un-silo knowledge and skills within an organization, cultivate an
environment supportive of flexible roles and cross-discipline collaboration, and celebrate
employee edification and contributions (for example, by running organizationwide data science
training). This will require the creation of new, agile teams, technical and nontechnical roles, and
responsibilities aligned with a quality-focused mission.

A significant outcome will be improved design quality, with hyperconnectivity improving the
product development process by making it transparent, traceable and open to all stakeholders.
Rather than designing for value, which today optimizes products for perceived or reported value,
deep quality will create products designed for real value. Customer, engineer and data-driven
innovation will combine to reduce product development time, foster mass customization and
reduce costs. Long term, this will improve the consumer experience by reducing bugs,
proactively identifying and addressing new opportunities, and facilitating enduring products with
sustaining revenue.

How we measure quality also will evolve. Traditional conformance and compliance standards
and processes, such as ISO 9000, will be disrupted, with modern quality defined malleably. In
addition to expanding from conformance to needs-meeting, quality will take on other metrics
pertaining to consent, trust, privacy and security, and even data quality. System performance no
longer will be evaluated at steady state, but rather during use or in the face of unanticipated
consequences. Will data be protected or leaked in the event of a power outage? You could argue
that information stewardship will become an integral part of future quality management.
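If data quality does become a tracked metric, even simple completeness and freshness scores over incoming telemetry could feed a quality dashboard. Below is a minimal sketch; the field names and thresholds are hypothetical:

```python
import time

def data_quality_report(records, required_fields, max_age_seconds):
    """Score a batch of telemetry records on completeness and freshness.

    records: dicts expected to carry required_fields plus a 'timestamp'
    in seconds since the epoch.
    """
    now = time.time()
    # A record is complete if every required field is present and non-null
    complete = sum(all(r.get(f) is not None for f in required_fields)
                   for r in records)
    # A record is fresh if it arrived within the allowed age
    fresh = sum(now - r.get("timestamp", 0.0) <= max_age_seconds
                for r in records)
    total = len(records) or 1
    return {"completeness": complete / total, "freshness": fresh / total}

# Hypothetical readings: one complete and recent, one stale with a missing field
readings = [
    {"device_id": "a1", "temp_c": 20.4, "timestamp": time.time()},
    {"device_id": "b2", "temp_c": None, "timestamp": time.time() - 7200},
]
report = data_quality_report(readings, ["device_id", "temp_c"], max_age_seconds=3600)
print(report)  # {'completeness': 0.5, 'freshness': 0.5}
```

Scores like these could sit alongside defect rate and conformance on a quality scorecard, making data stewardship measurable rather than aspirational.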

Cultivating a tech-progressive environment of cross-discipline innovation will help ensure deep
quality gains a foothold and has the most transformative impact within industry. Those players
willing to cultivate an environment that works toward holistic needs-meeting, rather than
optimizing traditional quality metrics, will be met with success and will help bring about a cross-
industry transformation, delivering high tech, high quality and low cost to the masses. Deep
quality is key to capturing value from mass customization and building enduring products that
address real-world needs.
