November 7, 2019
Net Content Control Blog Series Part 3 – Plant Management and Executives
In the first two blogs in this series, we discussed the importance of net content control to many industries, most notably the Food & Beverage industry, and the importance of operators to data collection and their role in overcoming the challenges of net content control. As mentioned, achieving optimal net content control is a hidden source of untapped profits for many Food & Beverage manufacturers, as well as a way of avoiding unnecessary operational and business risks. InfinityQS’ Quality Intelligence platform, Enact®, can help you overcome the challenges of net content control.
For large manufacturers with multiple packing or filling lines, the ability to compare and monitor net content performance across multiple dimensions enables them to effectively prioritize continuous improvement initiatives—and to direct their capital investment decisions to where they will have the most return in performance improvements and reduction in risk:
- Which filling lines are performing better or worse than others?
- Are there operators, shifts, or crews that are not as effective as others at controlling their processes, and do they require additional training?
- Which products or product groups have greater unpredictability and variability in net content? Is a particular piece of machinery, type of equipment, or equipment from a particular supplier failing more than others?
- Is a specific part of the overall packaging process underperforming other process areas?
In this blog, we’ll look at these questions and more...from a management perspective.
Striving for Continuous Improvement
Identifying areas of weakness and performance bottlenecks is only one side of the quality coin. By asking questions like those listed above, you can highlight areas performing particularly well and set benchmarks for all areas to achieve. Deciding on which questions to ask, however, is often the easy part.
Collecting and analyzing data to answer questions of performance (or lack thereof) is often difficult. To answer those questions efficiently—and to streamline your continuous improvement programs—we need access to real-time production data and information in a standard structure and format that can easily be rolled up (aggregated) and compared. Arriving at answers also requires analysis tools to quickly turn that operational insight into actionable intelligence. When information resides in disparate locations, in different forms—or worse, on paper only—the overhead involved with collecting and analyzing data makes these initiatives economically unfeasible (if not impossible).
At a senior management or operational executive level, overall net content performance and compliance reports are often prepared manually on a weekly, monthly, or even quarterly basis. Not only are net content reports time consuming (and therefore costly) to produce, but underperformance and compliance risks have likely occurred during the elapsed time.
Executives need access to real-time performance data presented in a way that is meaningful and intuitive—enabling them to make quick assessments of issues and potential issues, respond quickly, and easily see results.
In addition to operational performance monitoring, deciding when and where to invest in new equipment is a significant consideration. It may be that improvements in net content performance at a process level are limited by the physical capability of the existing, perhaps outdated, equipment. The ability to identify equipment with inherent limitations—and to evaluate the cost of overfill and the impact of compliance risks as a result—may improve cost analysis and help upper management to effectively target capital investment initiatives that will achieve the greatest results.
Enterprise Analytics and Grading
When manufacturers need to analyze and compare operational performance across multiple dimensions—such as products, processes, lines, shifts, plants, or regions—two factors become critical: standardization and centralization.
The Joys of Standardization
Standardization centers on how data collections are performed, the format of collected data, and the consistency of metadata (data that describes and gives information about the collected data). Without standardization, it can be difficult to accurately and efficiently analyze comparative data.
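To make that concrete, here is a rough sketch of why standardized records pay off; the field names below are invented for illustration and are not an Enact schema:

```python
from collections import defaultdict

# Hypothetical standardized records: every measurement carries the same
# metadata fields, so data from any line or plant can be compared directly.
records = [
    {"plant": "A", "line": "1", "shift": "Day",   "feature": "net content", "value": 503.1},
    {"plant": "A", "line": "1", "shift": "Night", "feature": "net content", "value": 501.8},
    {"plant": "B", "line": "4", "shift": "Day",   "feature": "net content", "value": 502.5},
    {"plant": "B", "line": "4", "shift": "Night", "feature": "net content", "value": 504.0},
]

# Because the metadata are consistent, rolling up by any dimension is trivial.
by_shift = defaultdict(list)
for r in records:
    by_shift[r["shift"]].append(r["value"])

averages = {shift: sum(values) / len(values) for shift, values in by_shift.items()}
```

Swap `"shift"` for `"plant"` or `"line"` in the grouping key and the same four records roll up along a different dimension—no transformation needed. Without consistent metadata, that one-line change becomes an integration project.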
The Power of Centralization
If the data needed to conduct analysis across your enterprise live in disparate or localized (siloed) systems—in a variety of formats—a significant effort is required to integrate and transform those data in a central repository. This may require specialized IT knowledge. Manufacturers soon realize the cost and complexity of such initiatives, so such efforts are rarely implemented with success. The end result is that many manufacturers continue to lack enterprise analysis capabilities—and instead rely solely on individual line-based performance. In other words, less-than-reliable decisions are made based on incomplete information (I call that guessing).
Enact includes a unified, cloud-based data repository that ensures all data entered into the system—manually, semi-automatically, or automatically collected; and regardless of size, complexity, or plant location—is stored in a single, centrally located repository, making it immediately available for enterprise analysis.
Standardization is a core element of the Enact architecture. When a manufacturer first deploys Enact, they configure an enterprise hierarchy that mirrors the physical and logical hierarchy of their actual manufacturing operation. This hierarchy starts at the corporate level and reaches down to individual processes, machines, and sub-processes—ensuring that all Enact configurations and data collections are assigned to the correct hierarchy level.
This enables data analysis to be performed at any organizational level—such as across a particular plant—and it ensures that Enact users have restricted visibility and user rights exclusive to their area of responsibility. This is an important security consideration.
The Enact Process Model
Process models are an innovative and powerful feature, unique to Enact, that further supports standardization. A process model is a logical, visual representation of a physical manufacturing process. While a process model can be used in isolation to represent a standalone process, it may also be linked to other input and output process models. For example, a soda mixing process, a PET bottle blowing process, a label printing process, and a cap receiving hopper process may all be inputs into a “bottle filling” process model. The output of that process model would be a final filled, labeled, and capped soda bottle that then becomes the input of another process (for instance, a carton-packing process).
Each process model is also used to define the data collections that must be performed as part of that process, such as fill level checks, label checks, and cap torque tests. These standardized process models—along with associated data collection configurations—can then be re-used across all manufacturing facilities with the same processes.
Any change made to a data collection or quality check requirement is immediately implemented across all processes using that single process model. This standardized approach to data collection pays dividends by enabling the analysis and comparison of data across the enterprise—minus the need for any form of data transformation or preparation.
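Conceptually, the bottle-filling example above can be pictured as linked nodes, each carrying its own standard set of checks. The sketch below is a simplified thought model under my own assumptions, not Enact's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class ProcessModel:
    """Hypothetical stand-in for a reusable process model."""
    name: str
    data_collections: list          # standardized checks performed at this process
    inputs: list = field(default_factory=list)  # upstream process models

# The linked processes from the bottle-filling example in the text
mixing   = ProcessModel("soda mixing", ["mix check"])
blowing  = ProcessModel("PET bottle blowing", ["bottle check"])
printing = ProcessModel("label printing", ["label print check"])
caps     = ProcessModel("cap receiving hopper", ["cap inspection"])

filling = ProcessModel(
    "bottle filling",
    ["fill level check", "label check", "cap torque test"],
    inputs=[mixing, blowing, printing, caps],
)

# Re-using the model at another plant means re-using the same definition,
# so a change to its checks shows up everywhere the model is used.
plant_b_filling = filling
```

The design point the text is making falls out naturally: because `plant_b_filling` references the same definition rather than a copy, editing one data collection updates every process that uses the model.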
With all data stored in a centralized and standardized data repository—and with built-in data visualization capabilities—enterprise analysis and comparison become truly effortless.
A Box & Whisker plot (below left) can be used to compare performance across any dimension, such as by shift, product, line, plant, process, region, and across time periods. This enables senior management and continuous improvement teams to immediately identify the source of greater risk, pinpoint performance bottlenecks, or home in on high performing areas that can be used as benchmarks for other areas.
Pareto charts (above right) are used by maintenance engineers to find equipment issues and identify root causes, enabling them to direct maintenance efforts to areas that will provide the greatest return.
The Trouble with Standard Metrics
The Enact data aggregation engine provides automatic and efficient updates to aggregate KPIs at predetermined intervals, and across very large data sets. These aggregate KPIs include statistical metrics such as Cpk, Ppk, Mean, and Standard Deviation, as well as net content-specific KPIs such as Giveaway % and Overfill % LSC (Label Stated Content). Aggregate KPI data can be viewed on dedicated dashboards using a variety of built-in visualizations.
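As a rough illustration of the kinds of calculations behind these KPIs (using textbook definitions, not Enact's internal implementation), here is how Ppk and a simple Giveaway % could be computed from a sample of fill measurements:

```python
import statistics

def capability_kpis(fills, lsl, usl, label_content):
    """Textbook-style KPI calculations from a sample of net contents.

    fills: measured net contents (e.g., grams); lsl/usl: spec limits;
    label_content: the label-stated content (LSC).
    """
    mean = statistics.mean(fills)
    sd = statistics.stdev(fills)  # overall (long-term) sigma, as used for Ppk;
                                  # Cpk would instead use within-subgroup sigma

    # Ppk: distance from the mean to the nearest spec limit, in units of 3 sigma
    ppk = min(usl - mean, mean - lsl) / (3 * sd)

    # One common definition of Giveaway %: average fill beyond label content
    giveaway_pct = 100 * (mean - label_content) / label_content

    return {"Mean": mean, "StdDev": sd, "Ppk": ppk, "Giveaway %": giveaway_pct}

kpis = capability_kpis(
    fills=[503.1, 501.8, 502.5, 504.0, 502.2, 503.4],
    lsl=500.0, usl=510.0, label_content=500.0,
)
```

Even this toy sample shows the hidden-profit angle: a mean fill just a few grams above the label-stated content is product given away on every unit.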
One of the major challenges with enterprise-level analysis and comparison lies in the fact that traditional metrics—such as Cpk, Ppk, or PPM—each tell one piece of your company’s process performance story. But trying to fit an entire site’s performance into a single number yields incomplete information. For example, suppose two sites have the same Ppk value. Are they really performing exactly the same? What if one site has a Ppk of 1.3 and the other has a Ppk of 1.4? How meaningful is that difference?
The answers depend on how the sites stack up to each other. Are production rates similar at each site, or is a single line or product masking a problem? Are all packaging lines performing with similar consistency, or are some performing better than others? Do the sites manufacture the same products and same mix of products? Does one site have a different or older infrastructure? Do both sites have operators with a similar experience level and turnover rate? Do both sites run a similar shift schedule?
Each of these factors will affect a metric such as Ppk in a different way. Knowing how a site performs doesn’t tell you if that site is performing to its full potential. If you have a site that has poor performance, who should address that problem? A site supervisor? A Six Sigma Black Belt? An equipment maintenance expert? An equipment vendor? That’s not to say these metrics don’t add value—they just don’t tell the whole story. Enter Enact grading, stage right…
Enact Takes it to the Next Level with Enterprise Grading
The unique enterprise grading capability of Enact assists manufacturers in solving these problems. Grading provides a summary analysis of individual data streams—the data provided by the unique combination of a single feature measured for a single part running on a single process—that can be aggregated to provide a grade for each critical feature of an entire process, part, or site. This gives you the benefit of rolled-up comparisons without losing the granularity of the raw data.
Enact grading handles all the calculations automatically, providing you with a simple letter-number combination—for example, A3 or B1—that represents both expected and potential yield. Together, these yield metrics reflect performance in a way that enables prioritization:
- Stream potential (A, B, C)
The letters A, B, and C in the grade correspond to a high, moderate, or low stream potential. Stream potential represents the optimal yield of which a process is capable under the current level of variability—presuming that the process is absolutely on target.
- Stream performance (1, 2, 3)
The numbers 1, 2, and 3 correspond to a high, moderate, or low stream performance. Stream performance is a ratio of the stream’s expected yield to its potential yield (i.e. Performance = Expected Yield/Potential Yield).
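A minimal sketch of how such a letter-number grade could be derived from expected and potential yield follows; the performance ratio is the one defined above, but the A/B/C and 1/2/3 cutoffs here are invented for illustration and are not Enact's actual thresholds:

```python
def stream_grade(expected_yield, potential_yield):
    """Combine stream potential (letter) and performance (number) into a grade.

    expected_yield, potential_yield: fractions between 0 and 1.
    The tier cutoffs below are hypothetical.
    """
    # Letter: the yield the stream could reach at its current variability,
    # presuming the process were exactly on target
    if potential_yield >= 0.99:
        letter = "A"
    elif potential_yield >= 0.95:
        letter = "B"
    else:
        letter = "C"

    # Number: how much of that potential is actually realized
    # (Performance = Expected Yield / Potential Yield, as defined above)
    performance = expected_yield / potential_yield
    if performance >= 0.98:
        number = 1
    elif performance >= 0.90:
        number = 2
    else:
        number = 3

    return f"{letter}{number}"
```

The two axes answer different questions, which is what makes the grade actionable: a C1 stream is running as well as it can and needs variability reduction (new equipment, engineering work), while an A3 stream is capable but off target and needs operational attention (centering, training).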
Enact grading enables manufacturers to view a single 3 x 3 grading for the entire organization or for a particular site—for example, providing an executive summary of overall performance. Clicking a single cell within the matrix drills down to more detailed analysis of that grading. Drilling into the data provides the contextual information required to understand where performance bottlenecks or compliance risks are more prevalent and enables the investigation and corrective actions to be directed to where they are most urgently required.
Well, there you have it: the pitfalls and challenges of net content control in the Food & Beverage industry, and how InfinityQS’ Quality Intelligence platform, Enact®, can help you overcome them. Achieving optimal net content control is a commitment to continuous improvement in your operations. But because it’s a hidden source of untapped profits for many manufacturers, it’s worth striving for. InfinityQS expertise and quality management experience can help you get there.
If net content control is a major concern for your organization, you can learn how Enact can help you address those challenges in our comprehensive use case.
Read the first two blogs in this series:
Take advantage of the technology at your fingertips today: contact one of our account managers (1.800.772.7978 or via our website) for more information about InfinityQS products and services.