50 Years of Quality: SPC Unleashed (Quality Magazine)
by Gillian Campbell
January 26, 2011
Since the introduction of computers into the manufacturing process, the capabilities of SPC have grown.
Calipers, gages and the like can be used to measure a dimension. Statistical process control (SPC) techniques provide the tools to measure the performance of an operation and express it in numbers. By looking at the numbers, quality engineers can determine whether the operation is running smoothly or whether it needs to be adjusted.
The techniques of SPC employ basic probability laws and statistical methods to develop models for the behavior of the variations witnessed in the quality characteristics of manufactured goods during production. Through the statistical study of these patterns of variation, it is possible to attribute certain types of variation to certain types of fault sources and thereby develop a knowledge and understanding of the types of corrective actions that may be required.
But where did it all begin? Walter A. Shewhart is credited with being the father of statistical quality control.
Walter A. Shewhart
In 1918, Shewhart joined the inspection engineering department of the Western Electric Co. Western Electric manufactured telephone hardware for Bell Telephone. Engineers at Bell Telephone had been working to improve the reliability of their transmission systems. They needed to reduce the frequency of failures and repairs to their amplifiers, connectors and other equipment, which were buried underground. Bell Telephone had already realized that reducing variation in manufacturing processes would have a positive impact on repair costs. At the same time, the company determined that continual adjustments in process parameters in reaction to nonconformances resulted in increased variation and a degradation of quality.
Bell Telephone's discoveries in product variation resulted in an inspection program, ensuring specification and quality standards to avoid sending defective products to the customer. Even though this program was somewhat effective, it was costly to inspect and sort the finished goods.
By 1924, Shewhart had framed the problem of variability in terms of assignable cause and chance cause. In May 1924, Shewhart prepared a memo of less than one page and forwarded it to his manager, George Edwards. About one-third of the page was devoted to a simple diagram that today is recognized as a control chart. This memo set forth the essential principles and considerations that became known as process quality control.
Shewhart's principle was that bringing a process into a state of statistical control would allow for the distinction between assignable and chance cause variations. By keeping the process in control, it would be possible to predict future output and to economically manage processes. This was the beginning of the modern scientific study of process control.
Software Enters the Picture
SPC alone cannot improve quality, but statistics can point out where the problem lies. Statistics can be used to summarize past events, and by analyzing and understanding those events, predictions can be made about the future.
But performing those calculations and drawing charts by hand is time-consuming. As access to computers in the manufacturing environment became more widespread, products that automated statistical analysis tasks were introduced to the market. While they provided the tools to analyze the process, the systems were often difficult to use. "They required tedious data entry and a Ph.D. statistician's knowledge to configure the software and analyze the data—a combination of tasks and abilities not easily married," Frank Tappen wrote in a 2006 Quality article. "Furthermore, obtaining interpretable results was a process completed long after the production run being analyzed had finished. This delay complicated the task of addressing process problems that continued to change—and probably resulted in additional process variation—in the interim."
Today's software eliminates some of yesterday's headaches. Many timesaving features have been added to advance SPC software usability. These software systems leverage other technologies such as automation standards and Internet access to simplify and integrate tasks, making administration, data collection, statistical analysis and reporting easier for today's technicians.
With automated SPC, data enters the system and immediately appears on the control chart. The computer finds the average, range or sigma and instantly draws the results of a sample on the chart. It calculates control limits and alerts the operator to out-of-control conditions in real time so that corrective action can be taken. This instant access to process data means an out-of-control process can be detected immediately, thereby saving time and money.
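The calculations behind such a system are straightforward. The following is a minimal sketch of an X-bar/R control chart in Python, the kind of computation automated SPC software performs on each sample; the function names and measurement data are illustrative, not drawn from the article, and the chart constants (A2 = 0.577, D3 = 0, D4 = 2.114) are the standard Shewhart values for subgroups of size 5.

```python
# Standard Shewhart chart constants for subgroups of size n = 5.
A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(subgroups):
    """Compute subgroup averages and ranges, plus the control limits
    (LCL, centerline, UCL) for the X-bar and R charts."""
    xbars = [sum(s) / len(s) for s in subgroups]   # subgroup averages
    ranges = [max(s) - min(s) for s in subgroups]  # subgroup ranges
    xbarbar = sum(xbars) / len(xbars)              # grand average
    rbar = sum(ranges) / len(ranges)               # average range
    limits = {
        "xbar": (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar),
        "r": (D3 * rbar, rbar, D4 * rbar),
    }
    return xbars, ranges, limits

def out_of_control(points, lcl, ucl):
    """Indices of points falling outside the control limits."""
    return [i for i, x in enumerate(points) if x < lcl or x > ucl]

# Fabricated example: five-piece samples of a machined dimension.
# The last subgroup has shifted upward, as an assignable cause would produce.
samples = [
    [10.1, 10.0, 9.9, 10.2, 10.0],
    [10.0, 10.1, 10.1, 9.8, 10.0],
    [10.2, 10.0, 10.1, 10.1, 9.9],
    [9.9, 10.0, 10.1, 10.0, 10.0],
    [10.1, 9.9, 10.0, 10.2, 10.1],
    [10.0, 10.0, 9.9, 10.1, 10.0],
    [10.1, 10.2, 10.0, 9.9, 10.0],
    [9.9, 10.1, 10.0, 10.0, 10.1],
    [11.0, 11.2, 10.9, 11.1, 11.0],  # shifted subgroup
]
xbars, ranges, lim = xbar_r_limits(samples)
lcl, cl, ucl = lim["xbar"]
print(out_of_control(xbars, lcl, ucl))  # the shifted subgroup is flagged: [8]
```

An automated system would run exactly this kind of check as each new sample arrives, raising an alert the moment a point falls outside the limits rather than waiting for an engineer to plot the chart by hand.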
With the advent of the Internet and Web-based solutions, no longer did an engineer even need to be in the same plant, much less the same country, to be alerted to an out-of-control process.
Just a few short years ago, Software as a Service (SaaS) SPC emerged to deliver quality intelligence for an entire enterprise and supply chain via the Internet through secure servers.
The application allows facilities and corporate offices to share data in real-time. In addition, supplier data and analysis can be viewed. Permissions and security features ensure that suppliers cannot see each other's data. Supply chain managers and executives can evaluate incoming materials prior to delivery, and strategically select suppliers based on the quality intelligence provided.
Over the years, SPC software has matured in functionality, making statistical analysis quicker, more powerful and more flexible. SPC software will continue to develop, and as technologies evolve, so too will the role of SPC.
Gillian is Editor for Quality magazine. You can reach her at firstname.lastname@example.org.