Quality Control and the Life of Data - Part 2: Second Life

By Steve Wise, Vice President of Statistical Methods | July 21, 2020

Fact checked by Stephen O'Reilly

In the previous Life of Data blog (part one), we talked about using data in the quick service restaurant industry to make decisions in the moment. Does the beef patty need to be cooked a little longer? Is the sanitizer strength in compliance? Did the pork roast cook-chill pass through the danger zone fast enough? There are a host of real-time decisions that need to be made by consulting the most recently collected data.
 
However, even after a data value has “served its purpose,” that value should not be thrown away. You see, multiple data values taken from a process over several days will expose details of your operations that could never be seen by treating the data as one-by-one collection tasks.
 
As I mentioned in the first blog, first life data helps you know what things need to be done to ensure real-time operational efficiencies and effectiveness. Specifically, first life data answers the question, “Do I need to do anything right now?”
 
Second life data, which I’ll discuss in this blog, operates in the realm of, “What do I need to do today so my operations run better tomorrow?” Let’s venture forth into the second life of data…
Second Life of Data

Data Streams

Because there is so much data being generated at restaurants, we need to organize the values into unique logical streams. A stream is a series of data that share the same attributes. The data in Table 1 below came from the walk-in cooler at Miami Store 1963. This is a single stream of data because all the values came from the same cooler from the same store. Six temperature readings were recorded each day over a 28-day period.
Table 1: Walk-in cooler temperature readings from Miami Store 1963. Red cells are readings that exceeded the 40° F upper limit.
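As a rough sketch of the idea, readings that share the same attributes (store and piece of equipment, in this case) can be grouped into streams. The field names and sample values below are hypothetical illustrations, not the actual Table 1 data:

```python
from collections import defaultdict

# Hypothetical readings: (store, equipment, timestamp, temperature in °F)
readings = [
    ("Miami Store 1963", "Walk-In Cooler", "2020-02-01 06:00", 37.2),
    ("Miami Store 1963", "Walk-In Cooler", "2020-02-01 10:00", 38.1),
    ("Miami Store 1963", "Fryer 2",        "2020-02-01 06:00", 351.0),
]

# A stream is a series of values that share the same attributes,
# so key each reading by (store, equipment).
streams = defaultdict(list)
for store, equipment, timestamp, value in readings:
    streams[(store, equipment)].append((timestamp, value))

for key, values in streams.items():
    print(key, "-", len(values), "readings")
```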
 

Goldmining

We now have a stream of data, so let’s go mining for gold. Let’s see if anything valuable is hiding in the data.

Nugget 1 – Common Language

The first insight one sees in the table is the set of values that exceeded the standard. There are 168 values in the table, and two of them did not meet the standard. One could say that the February compliance for this walk-in was 98.8%. Thinking in terms of compliance is a great way to distill mountains of data into a common metric that can be used to describe any stream of data that is compared to a standard.
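Here is a minimal sketch of that compliance calculation in Python; the readings are stand-ins chosen only to match the counts quoted above, not the actual Table 1 values:

```python
UPPER_LIMIT_F = 40.0  # walk-in cooler upper temperature limit

def compliance_percent(values, upper_limit=UPPER_LIMIT_F):
    """Percent of readings at or below the upper limit."""
    in_spec = sum(1 for v in values if v <= upper_limit)
    return 100.0 * in_spec / len(values)

# 168 readings with 2 exceedances, matching the table's counts.
example = [37.5] * 166 + [40.6, 41.2]
print(f"Compliance: {compliance_percent(example):.1f}%")  # -> Compliance: 98.8%
```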
 

Nugget 2 – Early Warning Detection

Every piece of equipment has its own personality. Plotting each data value on a time-ordered chart will help you visualize those personalities, but more importantly, will feed statistical analysis engines that can detect potential problems before they become costly incidents. Below is a control chart from InfinityQS quality intelligence software.
Figure 1: Walk-in cooler temperature values plotted over time. Upward trend is evident long before the temperature standard was violated.

Seeing data on a chart, rather than in a spreadsheet, makes all the difference in the world. The plot points in Figure 1 are from the Table 1 data. Starting at point #41, an upward trend begins to form. Notice that point #47 is blue and the blue continues through #50. These are the early warning signals that could have alerted the manager that something was going wrong with the walk-in several hours (16 hours, to be exact) before the temperature exceeded 40° F.
 
Statistics is the science of predicting the future. Wouldn’t it be nice to always have a 16-hour heads-up before disaster strikes? Heck, even a 30-minute warning is better than a surprise. Software can be used to provide real-time early warning signals across all data streams at the restaurant and send notifications to the person(s) in charge whenever any of their processes begin to say, “Hey! Something is amiss! Look at me!”
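The software’s analysis engines apply full batteries of statistical tests; as a rough illustration of one kind of run rule involved, here is a simple check that flags a sustained upward trend. The run length of six and the sample temperatures are assumptions made for this example, not the blog’s actual rule or data:

```python
def trending_up(values, run_length=6):
    """Return True when the last `run_length` points are strictly increasing.

    This is one simple run rule; commercial SPC software applies a much
    broader set of tests to catch shifts and trends early.
    """
    recent = values[-run_length:]
    return len(recent) == run_length and all(b > a for a, b in zip(recent, recent[1:]))

# Illustrative temperatures drifting upward well below the 40° F limit.
temps = [36.8, 37.0, 36.9, 37.1, 37.3, 37.6, 37.9, 38.3, 38.8]
for i in range(1, len(temps) + 1):
    if trending_up(temps[:i]):
        print(f"Early warning at point #{i}: upward trend detected")
```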
 

Nugget 3 – How Capable are My Processes?

Before any early warning notifications ever get sent, one can run capability analyses across all streams to assess which ones are most likely to cause problems. Figure 2 below is a capability analysis of the walk-in cooler. (Again, InfinityQS quality intelligence software at work—this time a simple histogram.)
Figure 2: Histogram of walk-in cooler temperatures. Notice that the entire bell-shaped curve is shifted towards the upper limit.

The Data Summary in Figure 2 (upper left) reports that the average (mean) temperature of the cooler is 37.9° F. This is an indicator that the thermostat on that cooler is probably set to 38° F. Even though this setting is within requirements, it is not a safe place to set the temperature: because of natural fluctuations, a 38° F set point sits too close to the upper limit for this cooler. Turning the thermostat down to 37° F would be a much better set point. Knowing the capability of every piece of equipment helps managers optimize set points across the operation.
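As a rough sketch of what a one-sided capability check looks like, the snippet below computes the mean, standard deviation, and a simple Cpu index against the 40° F limit. The sample readings are invented to center near the 37.9° F mean shown in Figure 2; the actual data and InfinityQS’s capability calculations may differ:

```python
import statistics

USL_F = 40.0  # upper specification limit for the cooler

def one_sided_capability(values, usl=USL_F):
    """Mean, standard deviation, and a rough Cpu = (USL - mean) / (3 * sigma)."""
    mean = statistics.mean(values)
    sigma = statistics.stdev(values)
    return mean, sigma, (usl - mean) / (3 * sigma)

# Invented readings centered near 37.9° F, as in Figure 2.
temps = [37.4, 38.1, 37.8, 38.3, 37.6, 38.0, 37.9, 38.2, 37.7, 38.0]
mean, sigma, cpu = one_sided_capability(temps)
print(f"mean={mean:.1f}° F  sigma={sigma:.2f}  Cpu={cpu:.2f}")
```

A higher Cpu means more headroom between the typical temperature and the upper limit; shifting the mean toward 37° F would raise it.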

Nugget 4 – Questions I Never Knew to Ask

This next visualization of the cooler data is a great comparative analysis tool. Figure 3 below—this time a Box & Whisker chart from our software—shows a weekly comparison of the walk-in cooler temperatures.
 
Figure 3 illustrates the weekly distribution of temperature values compared to the Upper Specification Limit (USL) and the Lower Specification Limit (LSL). This graph is called a Box & Whisker chart because the boxes represent where the middle 50% of the data reside and the whiskers represent the outer 25% on each side of the box.
Figure 3: Weekly comparisons of walk-in cooler temperature data.

Notice that all the data from Week 27 are below the target (“TAR” located in the center of the chart). This might indicate that someone has lowered the cooler’s thermostat. If it’s lower than necessary, one could save money by turning up the thermostat to the optimal set point. Also notice that some data from Week 24 exceeds the USL. We can drill into Week 24 by clicking the corresponding [+] button on the left.
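Here is a minimal sketch of how those weekly box-and-whisker summaries can be computed; the week numbers and temperatures below are illustrative, not the Figure 3 data:

```python
import statistics
from collections import defaultdict

# Illustrative (week, temperature °F) pairs pulled from the cooler's stream.
readings = [(24, 38.8), (24, 39.2), (24, 40.6), (24, 39.5),
            (27, 35.9), (27, 36.1), (27, 36.4), (27, 36.0)]

by_week = defaultdict(list)
for week, temp in readings:
    by_week[week].append(temp)

for week in sorted(by_week):
    values = sorted(by_week[week])
    q1, median, q3 = statistics.quantiles(values, n=4)  # box edges and median
    lo, hi = values[0], values[-1]                       # whisker ends
    print(f"Week {week}: whiskers {lo}-{hi}° F, box {q1:.1f}-{q3:.1f}° F, median {median:.1f}° F")
```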
 

In Closing

You can see that by analyzing the data you originally collected (rather than storing and never looking at it, or worse, tossing it out), you can garner valuable quality intelligence. The various charts available within our software can help you quickly visualize the data and draw conclusions.
 
InfinityQS quality intelligence solutions, Enact® and ProFicient, are powerful quality allies; by taking advantage of their robust analysis engines and analysis tools, you can learn from your data and make transformative decisions to improve your operations.


Read the other blogs in this series.

Take advantage of the technology at your fingertips today: contact one of our account managers (1.800.772.7978 or via our website) for more information.
 

 


 
