Quality Control and the Life of Data – Part 5: Protecting Your Brand

By Steve Wise, Vice President of Statistical Methods | August 27, 2020

Fact checked by Stephen O'Reilly

During the previous four blogs in this series, we’ve discussed how data takes on different lives or, more accurately, how data provides value that goes beyond the initial reason for collecting the data in the first place.
The manufacturing marketplace is becoming more competitive with every passing day. Organizations must be able to rely on the strength of their brand and reputation to attract and retain customers, supply chain business partners, and, in some cases, investors. For many manufacturers, though, protecting brand and reputation is a somewhat elusive goal—until an incident or crisis occurs.
In this, our final blog of the series, we'll discuss how you can strengthen your quick service restaurant (QSR) brand's resilience in the face of a crisis through proper food safety data analysis.
Remember that no matter where a data value resides in its life cycle, that data—data that you collect all the time—can be used to protect your brand.
Focus on Food Safety

Attributes of Good Data

Because we are focusing on food safety, we already know that most data will be associated with temperature and time. To ensure that we can attain the greatest value from that data, here are some attributes to consider:
  1. There must be known standards from which to judge the data. Without standards, data is not actionable. Knowing the water temperature in the first rinse sink is not useful unless the rinse water temperature is tied to a standard. For example, the temperature needs to be at least 105° F.
  2. Data are collected on a regular basis. Each data collection result represents only one snapshot in time. To understand the personalities of data streams that contribute to food safety, one needs to review constantly updated snapshots. Think of a single data collection as a single frame in an animated movie. When we have a filmstrip of the data stream, we better understand the past and present, and have reasonable expectations of how that data stream should behave in the future. Knowing that the typical temperature profile of a cooling unit is a sawtooth pattern, rather than a straight line, is just one example contrasting snapshots with filmstrips.
          Typical sawtooth pattern of a cooling unit's temperature profile over time.
  3. Data has a customer. Someone needs to make a decision based on the result. If data are worth collecting, the results need to answer someone’s “do something” or “do nothing” question. Customers can be a person in charge, an auditor, or a patron—really, anyone who is affected by that data is a potential customer.
  4. Data has an owner. Someone needs to be responsible for collecting the data and must have the authority to execute any dispositions that arise from the results. If the kitchen ambient temperature is too warm, the “data owner” needs the authority and wherewithal to adjust the kitchen’s HVAC temperature controller.
  5. The factors that affect the data value are controllable. Using the kitchen temperature again, we assumed there was an adjustable thermostat to bring the temperature down. If there is no vented air conditioning in the kitchen, the disposition options get a little more interesting. Without the ability to control the factors, collected data results are just measurements of present-state conditions with no way to make course corrections.
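The first attribute above, judging each reading against a known standard, can be sketched in a few lines. This is a hypothetical Python illustration, not part of any real system; the 105° F rinse-water minimum comes from the example above, and the function name and messages are invented:

```python
# Minimal sketch: a reading only becomes actionable when it is tied to
# a standard. The 105 °F rinse-water minimum comes from the example
# above; everything else here is illustrative.

RINSE_MIN_F = 105.0  # known standard for first-rinse water temperature

def disposition(reading_f: float, minimum_f: float = RINSE_MIN_F) -> str:
    """Answer the data customer's "do something / do nothing" question."""
    if reading_f >= minimum_f:
        return "do nothing"   # standard met
    return "do something"     # e.g., reheat the rinse water and re-check

print(disposition(108.2))  # do nothing
print(disposition(101.5))  # do something
```

Without the `minimum_f` standard, the same reading would be just a number on a form; with it, every reading produces a clear disposition.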
Who knew there needed to be so much thought given to a number being written down on a form? Yes, the act of collecting the data needs to be quick and simple, but these five points need to be considered to realize the value—the full potential—that is possible through collecting that data.
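The snapshot-versus-filmstrip contrast from attribute 2 can also be illustrated in code. The sketch below uses invented numbers to generate a sawtooth profile like the one pictured above; any single reading near a peak looks warm, but the repeating pattern is the cooler's normal behavior:

```python
# Illustrative sketch (invented numbers): a healthy cooler cycles between
# compressor-off (temperature drifts up) and compressor-on (temperature
# drops back), producing a sawtooth. A lone snapshot near a peak can look
# alarming; the filmstrip shows it is the normal pattern.

def sawtooth_profile(cycles, low=34.0, high=40.0, steps_per_cycle=4):
    """Yield one reading per step: a slow rise toward `high`, then a reset."""
    rise = (high - low) / steps_per_cycle
    for _ in range(cycles):
        for step in range(steps_per_cycle):
            yield low + rise * step

temps = list(sawtooth_profile(cycles=3))
print(max(temps))  # 38.5 — a single peak reading looks warm in isolation
print(temps[:4])   # [34.0, 35.5, 37.0, 38.5] — one cycle of the normal pattern
```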

Protecting Your Brand

The average day in the life of collecting data at your restaurant can be mundane. As a matter of fact, we all hope it’s mundane! When all values tell the person in charge that no actions are required—that all systems and processes are meeting the requirements—then life is good. But eventually, something unfortunate will happen. And when that event occurs, it’s the data on hand that will determine the extent of the fallout.

Scenario: Walk-In Cooler Goes Down at 1:00 AM

You open your store at 7:00 AM and begin the morning checklist routine. You notice that the walk-in cooler is not working.
There are three ways in which this scenario plays out:
  • When there is no data to indicate when the cooler went down, one must assume the worst case…the walk-in went down just after the closing checks were completed at 10:30 PM. Let’s also assume that 4:00 PM was the last time the walk-in temp was recorded on any official form. The closing manager may have seen that the cooler was functioning during closing, but only the 4:00 PM documented data value can be used to determine the disposition. Under these conditions, the corrective action should be to destroy all perishable items in the walk-in.
  • The 15-minute interval temperature data logger shows that the cooler temp started trending upwards just after 1:00 AM. The cooler went into the danger zone at 5:15 AM. Knowing that the food was subject to danger-zone temperatures for less than two hours, some, if not all, of the food could be salvaged by quickly moving the items into backup refrigeration units until the walk-in can be fixed. This is still a scramble, but not a complete loss.
  • The data logger sends an alert to the person in charge when the temperature trend pattern is detected. Assuming the pattern is triggered after the sixth consecutive rising temperature is recorded, this would generate the alert between 2:30 AM and 3:00 AM. Because the data were being analyzed using trending rules, all the food can be saved with complete confidence that no foodborne risks were ever present.
Yes, a 3:00 AM scramble to the store is no fun, but having the data and utilizing better predictive analysis techniques clearly saved the day.
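The trending rule in the third scenario can be sketched as follows. This is a hypothetical Python illustration, not any real data logger's API; the six-consecutive-rise threshold and the 15-minute interval come from the scenario above, and all temperatures are invented:

```python
# Hypothetical sketch of the trending rule described above: alert after
# six consecutive rising readings from a 15-minute-interval data logger.
# Timestamps and temperatures are invented for illustration.
from datetime import datetime, timedelta

RISES_TO_ALERT = 6  # consecutive increases before alerting

def first_alert(readings):
    """Return the timestamp of the reading that triggers the alert, or None."""
    rises = 0
    for (_, prev), (ts, cur) in zip(readings, readings[1:]):
        rises = rises + 1 if cur > prev else 0
        if rises >= RISES_TO_ALERT:
            return ts
    return None

# Cooler holds steady at 36 °F overnight, then climbs just after 1:00 AM.
start = datetime(2020, 8, 27, 0, 0)
temps = [36.0] * 5 + [36.4, 36.9, 37.5, 38.2, 39.0, 39.9, 40.9]
readings = [(start + timedelta(minutes=15 * i), t) for i, t in enumerate(temps)]

print(first_alert(readings))  # 2020-08-27 02:30:00
```

Six rises at 15-minute intervals after the 1:00 AM failure put the alert at about 2:30 AM, hours before the food would have reached the danger zone.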

Wrap Up

So, over the course of the last few weeks, we’ve focused on the importance of data at your QSR, how and why comparing data stream results to standards is so important to your operations, how early warning predictors can help your restaurant, and finally how data provides value that goes beyond the initial reason for collecting the data in the first place—to protect your products and your brand.
I urge you to take the time to investigate what the Enact® Digital Food Safety platform can do for your business. Don’t allow preconceived notions of the difficulties and costs of transforming your quality efforts in your QSR to get in the way of making the leap to continuous improvement, safety, quality excellence…and protecting your brand. Get started with your quality transformation today with Enact.
Take advantage of the technology at your fingertips today: contact one of our account managers (1.800.772.7978 or via our website) for more information.


InfinityQS Fact Checking Standards

InfinityQS is committed to delivering content that adheres to the highest editorial standards for objective analysis, accuracy, and sourcing.

  • We have a zero-tolerance policy regarding any level of plagiarism or malicious intent from our writers and contributors.
  • All referenced articles, research, and studies must be from reputable publications, relevant organizations, or government agencies.
  • Where possible, studies, quotes, and statistics used in a blog article contain a reference to the original source. The article must also clearly indicate why any statistics presented are relevant.
  • We confirm the accuracy of all original insights, whether our opinion, a source’s comment, or a third-party source so as not to perpetuate myth or false statements.
