Written by Mark Buzinkay

Quality is more than a technical attribute—it is a strategic imperative rooted in how well a product meets customer expectations. While traditional views emphasize conformance to measurable standards, modern quality strategies recognise that customer perception is the ultimate benchmark. Balancing subjective and objective definitions of quality is essential for operational success. In this article, we discuss the evolution of quality thinking and outline a practical, problem-driven quality strategy for building and sustaining quality in manufacturing.


Quality Strategy: Why does quality matter?

Quality is foundational for manufacturing performance, but it means different things to different individuals and organisations. Some argue that the common perception of quality as conformance to requirements is wrong because it ignores customer needs; in fact, quality is in the eye of the customer. Because customer-perceived quality is neither observable nor controllable in production, we rely on substitute characteristics: variables we can measure or attributes we can observe. Any unit that fails to meet requirements on these substitute characteristics is defective, but a unit that passes does not necessarily meet customer needs.

Quality is one of several dimensions of competitiveness and, contrary to long-held beliefs, is not necessarily achieved at the expense of performance in the others. In fact, the ability to produce quality products positively affects flexibility, speed, and productivity. The approach to quality improvement in a plant should be dictated not by a philosophy but by the quality problems the plant actually faces.


Quality Strategy: What is quality management?

Throughout the early 20th century, "quality" described high-end luxury products as opposed to their low-end, basic counterparts. The term is no longer used in this sense. The quality of a product is not based on its position within a product line or the market it serves; it is strictly a function of how well it serves the customers who buy it. A luxury product can be deficient in quality, and a basic product can excel.

The most commonly accepted definition of product quality is: "Quality is the ability to meet customer needs".

In this view, quality is subjective: it is the agreement of reality with expectation. A customer buys a product in the hope that it will do what the seller said it would, and quality is achieved when all of the product's characteristics match their expected values (see also quality documentation).

By contrast, the objective view defines quality as conformance to requirements: formal, numerical characteristics of the product that must be within specified tolerances.

High quality according to the objective definition may be poor quality by the subjective one. In pastry, the proof of the cake is in the eating, but eating is destructive testing, and a pastry shop cannot eat all its cakes. To check their quality, it has to use measurable substitute characteristics, also known as proxies. If a cake's diameter is off and it contains half as much sugar as it should, it is certainly defective; on the other hand, the right diameter and sugar content do not guarantee that it tastes good. While conformance to requirements can save you from an expensive lawsuit, the ultimately important criterion is the customer's subjective experience.
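To make the proxy idea concrete, here is a minimal sketch in Python of such a conformance check; the characteristic names, nominal values, and tolerances are invented for illustration, not real specifications.

```python
# Minimal sketch of conformance checking on substitute characteristics.
# Characteristic names and tolerances are illustrative assumptions.

SPECS = {
    "diameter_mm": (248.0, 252.0),  # nominal 250 mm, +/- 2 mm
    "sugar_g": (95.0, 105.0),       # nominal 100 g, +/- 5 g
}

def nonconformities(unit: dict) -> list[str]:
    """Return the substitute characteristics that are out of tolerance."""
    out = []
    for name, (low, high) in SPECS.items():
        value = unit[name]
        if not (low <= value <= high):
            out.append(f"{name}={value} outside [{low}, {high}]")
    return out

cake = {"diameter_mm": 251.2, "sugar_g": 52.0}  # half the sugar it should have
problems = nonconformities(cake)
print("defective" if problems else "conforming", problems)
# A unit that passes every check conforms to requirements, which still
# says nothing about whether the cake actually tastes good.
```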

For many manufacturers, their reputation for quality is the crown jewel, and management does whatever it takes to maintain it. To maintain and improve manufacturing quality, they need to understand customer expectations, adapt their internal processes, communicate their own expectations to suppliers, and help suppliers meet them.


What is a defect?

Terms like defect, defective, and error occur all the time in discussions of quality, and we must be clear about what we mean by them:

  • A defect is a characteristic a product unit should not have.
  • A defective is a unit of product with at least one defect.
  • An error is committed by a human accidentally doing the wrong thing.

Errors cause defects, but not all defects are the result of errors.
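These distinctions map naturally onto a simple data model. The following Python sketch (the type and field names are our own invention, not a standard) encodes the three definitions:

```python
from dataclasses import dataclass, field

@dataclass
class Defect:
    """A characteristic a product unit should not have."""
    description: str
    caused_by_error: bool = False  # not every defect traces back to a human error

@dataclass
class Unit:
    serial: str
    defects: list[Defect] = field(default_factory=list)

    @property
    def is_defective(self) -> bool:
        """A defective is a unit with at least one defect."""
        return len(self.defects) > 0

unit = Unit("SN-001", [Defect("missing screw", caused_by_error=True)])
print(unit.is_defective)  # True
```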


A brief history of quality

The concept of quality has evolved significantly over the centuries, shaped by advancements in technology, societal changes, and evolving business practices. In its earliest forms, quality was a deeply personal and localised concern. In ancient times, craftsmen took full responsibility for the products they created, often from start to finish. These artisans took pride in their work, and their reputations depended on the quality of their goods. There were no formal systems or metrics; quality was judged by the customer or by the master overseeing the work. This approach ensured a high level of craftsmanship, but it was inherently limited in scale.

During the Middle Ages, the notion of quality became more institutionalised through the formation of guilds. These organisations of skilled workers and artisans set standards for training, work practices, and the final products. Guilds protected both the integrity of the trade and the consumer by maintaining certain levels of quality. Members were expected to uphold the guild's reputation, and apprentices underwent years of training before being allowed to produce goods independently. While this system provided structure and accountability, it still relied heavily on manual inspection and individual responsibility.

The Industrial Revolution marked a significant turning point in the history of quality. With the advent of mechanised production and the growth of factories, the traditional artisan model became less viable. Goods were now being produced in large quantities by multiple workers, often performing specialised tasks. This shift made it harder to maintain consistent quality across all products. Initially, quality control was carried out through end-of-line inspection. Defective products were identified and removed before reaching the market, but this approach was reactive and inefficient. There was little emphasis on preventing defects during the manufacturing process.

The early 20th century brought more scientific approaches to quality. Influenced by the rise of industrial engineering, thinkers like Frederick Winslow Taylor introduced systematic methods for improving efficiency, although not necessarily focused on quality. The real breakthrough came in the 1920s when Walter A. Shewhart, working at Bell Laboratories, developed the concept of Statistical Process Control (SPC). Shewhart's control charts enabled manufacturers to monitor production processes and detect variability that could lead to defects. This marked the beginning of a more proactive approach to quality, emphasising process stability and data-driven decision-making.

After World War II, the focus on quality intensified, particularly in Japan. American experts like W. Edwards Deming and Joseph Juran were invited to help rebuild Japanese industry. They introduced new philosophies that emphasised continuous improvement, management responsibility, and employee involvement. Japanese companies readily adopted these ideas and incorporated them deeply into their corporate culture. Practices such as Kaizen, Total Quality Control, and later Total Quality Management (TQM) emerged from this environment, helping Japanese manufacturers achieve unprecedented levels of efficiency and product excellence.

By the late 20th century, quality management had become a central business concern worldwide. The introduction of the ISO 9000 series in 1987 established a globally recognised standard for quality management systems, enabling organisations to demonstrate their commitment to quality to customers and partners. Around the same time, Six Sigma gained popularity, particularly in large corporations like Motorola and General Electric. This methodology focused on reducing defects and variability through rigorous data analysis and process control.

Today, quality is viewed not just as a production concern but as a strategic asset that influences every aspect of a business. It plays a crucial role in enhancing customer satisfaction, maintaining brand reputation, and achieving a competitive advantage. New technologies such as artificial intelligence, machine learning, and the Internet of Things are transforming quality management once again. These tools enable real-time monitoring, predictive maintenance, and advanced analytics, allowing organisations to prevent problems before they occur. Quality has thus come full circle, retaining its core focus on consistency and excellence while adapting to the demands of a complex, digital world.


The right quality strategy

The approach to quality improvement in a factory should be dictated by the quality problems it is facing rather than by blindly copying a quality improvement method. When the primary quality problem is operator error, mistake-proofing is necessary to prevent it. But if the main problem is an inability to hold tolerances consistently, spending your resources on mistake-proofing would be rearranging deck chairs on a sinking ship. Conversely, data science is of little help in preventing errors that occur once or twice a year.

The right approach to quality management is tailored to specific needs, but there is a general sequence in which problems should be addressed:

  1. Process capability: If your main problem is a lack of process capability, your failure rate will typically be high, and this issue should be addressed ahead of anything else. Beyond this point, it's no longer the identification of root causes that matters most, but instead the speedy detection of problems.
  2. Problem detection time: The next order-of-magnitude improvement in quality performance comes from moving to a one-piece flow with first-in-first-out (FIFO) sequencing. The proportion of defectives traced to the operations in a cell drops by at least 50%.
  3. Human error: One-piece flow and FIFO, in turn, level off at defective rates commonly on the order of 1% down to 0.1%, i.e., 1,000 parts per million (ppm); they cannot by themselves achieve low ppm rates. Human error still affects individual parts, and this is where mistake-proofing comes in: it is the strategy that takes you from, say, 1,000 ppm to 15 ppm (see the conversion sketch after this list).
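Because these rates mix percentages and ppm, a quick conversion keeps the orders of magnitude straight; a minimal sketch:

```python
def pct_to_ppm(pct: float) -> float:
    """Convert a defective rate in percent to parts per million."""
    return pct / 100.0 * 1_000_000

for pct in (1.0, 0.1, 0.0015):
    print(f"{pct:.4f}% = {pct_to_ppm(pct):>10,.0f} ppm")
# 1.0000% =     10,000 ppm
# 0.1000% =      1,000 ppm
# 0.0015% =         15 ppm
```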


Quality Strategy Step 1: Ensure process capability

Low yields and high failure rates are a fact of life in high-technology manufacturing. Just as SPC came out of Western Electric, the high technology of the 1920s, Six Sigma came out of Motorola in a similar situation in the 1980s.

This does not mean that the relevance of the approach is limited to high technology. Even in mature industries there are processes, such as painting car bodies, with double-digit rework rates, even among the best manufacturers. And processes temporarily exhibit this type of behaviour during the introduction of new products.

By "statistical methods," most quality professionals understand the combination of control charts, histograms, scatter plots, sampling methods, inspection plans, and other tools collectively known as SPC or SQC. These methods, pioneered a century ago by Walter Shewhart and later promoted and enhanced by W. E. Deming, A. Wald, and K. Ishikawa, were targeted at the processes and information technology of their era. The production of these charts continues today as a result of customer mandates enforced by audits, yet they play almost no role in problem-solving, even where statistical methods are genuinely needed.
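For readers who have never computed one, here is a minimal sketch of the logic behind a Shewhart-style individuals chart, with made-up data and the standard moving-range estimate of short-term variation; it illustrates the idea rather than being a full SPC implementation.

```python
import statistics

# Made-up measurements of one product characteristic, in production order.
x = [10.02, 9.98, 10.05, 9.97, 10.01, 10.40, 10.00, 9.99, 10.03, 9.96]

# Estimate short-term sigma from the average moving range (d2 = 1.128 for n = 2).
moving_ranges = [abs(b - a) for a, b in zip(x, x[1:])]
sigma_hat = statistics.mean(moving_ranges) / 1.128

center = statistics.mean(x)
ucl = center + 3 * sigma_hat  # upper control limit
lcl = center - 3 * sigma_hat  # lower control limit

for i, value in enumerate(x):
    flag = "OUT OF CONTROL" if not (lcl <= value <= ucl) else ""
    print(f"sample {i}: {value:5.2f} {flag}")
```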

Since the 1930s, not only has data acquisition technology changed, but statisticians have developed many tools that take advantage of computers. Semiconductor testers gather hundreds of variables on each circuit, but even in mature industries, spindle probes in machining centres, Coordinate Measuring Machines (CMMs), or Supervisory Control and Data Acquisition (SCADA) systems can collect numerous product characteristics and feed them immediately to analysis software.

Even the electronic spreadsheets found on every engineer's desk come with a data analysis package that is far more powerful than traditional SPC. Beyond this, specialised statistical software products supporting multivariate exploratory data analysis and Design of Experiments are available for less than the cost of one engineer-week. Yet, except in special industries like semiconductors, the power of these tools remains largely untapped. The reason is not technology but a lack of skills. Not only are people who understand tools like the Fisher discriminant or fractional factorial designs of experiments not commonly found in factories, but their knowledge is of limited value unless they also understand plasma etching, the appropriate feeds and speeds for milling titanium, or whatever other process they are using. And people combining data science and process knowledge are almost nowhere to be found.
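As a taste of the kind of analysis any of these packages can do in seconds, here is a minimal process-capability sketch; the specification limits and measurements are invented.

```python
import statistics

# Invented data: a machined dimension with specification 25.00 +/- 0.05 mm.
measurements = [25.01, 24.99, 25.02, 25.00, 24.98, 25.03, 25.01, 25.00, 24.99, 25.02]
lsl, usl = 24.95, 25.05

mu = statistics.mean(measurements)
sigma = statistics.stdev(measurements)  # sample standard deviation

cp = (usl - lsl) / (6 * sigma)               # potential capability (spread only)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # actual capability, penalises off-centre
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
# Rules of thumb vary; Cpk >= 1.33 is a commonly quoted minimum target.
```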


Quality Strategy Step 2: Built-in quality

Once process capability is established, the nature of the plant's quality problems changes, from drifts in process parameters to infrequent discrete events like tool breakage, skipped steps, or mislabelled components. The challenge at this point is to detect them quickly and react promptly: speed takes precedence over analytical sophistication.

The solution is to detect quality problems immediately, or shortly after the process that caused them. The way to do this is to implement one-piece flow, first-in-first-out (FIFO) sequencing, visual control, and in-line quality control through manual or computerised visual inspection or go/no-go gauges.

The goal is never to pass on a mistake to the next process, but these methods do not guarantee that none slips through. There is therefore also a need for problem-solving techniques that handle deviations quickly, such as Quick Response Quality Control (QRQC). These methods are ineffective on processes with double-digit percentages of defectives, where they would simply break down: high-volume production with an unstable process requires buffers between every two operations to protect each downstream operation against output fluctuations upstream. The process characterisation and stabilisation work done with data science enables one-piece flow and FIFO, which in turn let the process reach the next plateau.
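The arithmetic behind the detection-time argument is simple. In the sketch below (Python, with invented buffer sizes), the number of suspect units between a defect's creation and its detection is bounded by the work-in-process sitting between the operations:

```python
# With FIFO, a defect made now is detected only after everything already
# buffered ahead of it has been consumed; all units made in the meantime
# are suspect. Buffer sizes below are invented for illustration.

def units_at_risk(buffer_sizes: list[int]) -> int:
    """Upper bound on suspect units between defect creation and detection."""
    return sum(buffer_sizes)

batch_production = [50, 50, 50]  # pallets of 50 between three operations
one_piece_flow = [1, 1, 1]       # at most one unit between operations

print("batch:", units_at_risk(batch_production), "suspect units")        # 150
print("one-piece flow:", units_at_risk(one_piece_flow), "suspect units") # 3
```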


Quality Strategy Step 3: Mistake-proof processes

Even when all of the above is implemented, humans still make mistakes. The only remedy is to make it impossible to err. Poka-yoke designates small equipment retrofits that mistake-proof production processes. Where most defects are caused by human error, poka-yoke can take your reject rate from 0.5% to below 1 ppm. Human error is particularly notorious in mechanical and electronics assembly operations (see also: Pick 2 light: Manual versus automated retrieval).
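In software terms, a poka-yoke is an interlock that makes the wrong action impossible rather than merely detectable after the fact. A hypothetical sketch (the station, steps, and sensor names are invented):

```python
# Hypothetical poka-yoke-style interlock: the fixture will not release the
# part until every required operation has been confirmed by its sensor.

REQUIRED_STEPS = ("screw_1_torqued", "screw_2_torqued", "label_scanned")

def try_release(confirmed: set[str]) -> bool:
    """Release the part only if no required step is missing."""
    missing = [step for step in REQUIRED_STEPS if step not in confirmed]
    if missing:
        print("Fixture locked, missing:", ", ".join(missing))
        return False
    print("All steps confirmed, part released.")
    return True

try_release({"screw_1_torqued", "label_scanned"})  # locked: screw_2 missing
try_release(set(REQUIRED_STEPS))                   # released
```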


Quality Strategy Step 4: Manage rework and recalls

We marvel at the ingenuity of poka-yoke devices, but we have yet to invent one for every defect opportunity, and until we do, we must rely on less powerful tools. The inevitable result is that poor quality sometimes slips through to the next process or, worse, to the customer. How we handle quality mistakes deserves a strategy of its own.

For rework, we must decide where to do it (in the line or at the end of the line), set the threshold between scrapping a part and reworking it, and determine how to manage the parts awaiting rework. All rework is waste, but sometimes it is even planned. For example, during the global chip shortage in 2021, several vehicle manufacturers built semi-finished vehicles to stock, with the idea of reworking them once the needed chips became available. While this should normally be avoided, managers in such situations must weigh the best use of available resources; in this case, sending the workforce on paid leave was considered an even greater waste and a riskier option than building semi-finished products.
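One way to frame the scrap-versus-rework threshold is as a cost comparison; a minimal sketch with invented figures:

```python
# Invented cost figures: rework a defective only if doing so is cheaper
# than scrapping it and building a replacement.

def decide(rework_cost: float, scrap_value: float, replacement_cost: float) -> str:
    """Compare reworking a defective against scrapping it and making a new unit."""
    net_cost_of_scrapping = replacement_cost - scrap_value
    return "rework" if rework_cost < net_cost_of_scrapping else "scrap"

print(decide(rework_cost=12.0, scrap_value=3.0, replacement_cost=40.0))  # rework
print(decide(rework_cost=55.0, scrap_value=3.0, replacement_cost=40.0))  # scrap
```

In practice the threshold also reflects capacity, lead time, and the quality risk introduced by the rework itself, but the cost comparison is the starting point.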

Defectives eventually, and almost inevitably, slip through to customers. If the defects are systematic and safety-critical, manufacturers have to issue a recall. Recalls are common in the pharmaceutical and food and beverage industries, as well as in high-technology industries such as healthcare technology, automotive, and aerospace. Retail chains, too, regularly recall consumer goods, such as toys, sourced from low-quality suppliers. Managing recalls well can turn a negative customer experience into a positive one.


FAQ – Quality Strategy

What is the difference between objective and subjective definitions of quality?

The objective definition of quality focuses on conformance to measurable requirements—such as dimensions, tolerances, or specifications—typically verified through inspection and testing. In contrast, the subjective definition is based on customer expectations and perceptions. A product may meet all technical standards yet still be perceived as low quality if it fails to satisfy the user’s needs or preferences.

Why is process capability the first step in a quality strategy?

Process capability determines whether a manufacturing process can consistently produce products within specified limits. If a process lacks this capability, no amount of inspection or mistake-proofing will yield acceptable quality. Addressing this foundation ensures that improvement efforts are not wasted on unstable or poorly understood processes.

Can quality improvements impact other areas like speed or cost?

Yes. Contrary to the belief that quality comes at the expense of efficiency, a robust quality strategy can enhance productivity, flexibility, and speed. High-quality processes reduce rework, minimise waste, and improve workflow continuity—ultimately leading to better overall performance and lower operational costs.


Takeaway

A successful quality strategy must be tailored to the specific challenges a plant faces, beginning with process capability and evolving through mistake-proofing and recall management. Rather than following a one-size-fits-all philosophy, manufacturers should adapt their approach to match operational realities. In the digital age, technologies like Automatic Identification and Data Capture (AIDC) play a pivotal role by enabling real-time monitoring, reducing errors, and enhancing traceability across the production chain. By integrating AIDC into their quality strategy, manufacturers gain the visibility and control needed to meet both regulatory demands and customer expectations.



Glossary

Poka-yoke (often mistakenly called “yoka-poke”) is a Japanese term meaning "mistake-proofing." It refers to any mechanism or process design that prevents human errors from occurring or makes them immediately detectable. Common in lean manufacturing, poka-yoke devices are simple, low-cost solutions that ensure correct actions are taken during production or assembly. The concept was formalized by Shigeo Shingo at Toyota. (2)

References:

(1) Baudin, M. & Netland, T. (2023). Introduction to Manufacturing. Routledge.

(2) Shingo, S. (1986). Zero Quality Control: Source Inspection and the Poka-Yoke System. Productivity Press.





Author

Mark Buzinkay, Head of Marketing

Mark Buzinkay holds a PhD in Virtual Anthropology, a Master in Business Administration (Telecommunications Mgmt), a Master of Science in Information Management and a Master of Arts in History, Sociology and Philosophy. Mark