For many years, the concept of “big data” has been framed as a key to resolving challenges and driving improvements across many business segments. The potential of big data, however, can sometimes get lost when applied to food safety. With few “positive” food safety test results from which to mine data and glean insights into potential environmental or product contamination, the food industry must evolve to take the “small data” gathered from incidents and augment it with expertise in microbiology and food safety to solve complex challenges. Doing so will unlock actionable insights for smarter, more dynamic risk assessment in food safety and quality control.
This approach, called “augmented diagnostics,” is rigorously grounded in science and delivers data reporting and test results, as well as in-depth insights to help food industry leaders make better decisions that will drive efficiency while also improving public health.
For augmented diagnostics to succeed in solving food safety and quality challenges, an organization must be willing to invest in two key areas: advanced tools to gather enhanced data and information, and expertise in data science, microbiology, and food processing to analyze that data and recommend the best path forward.
With so many diagnostic testing solutions available in the market, finding the right partner that indexes highly against both of these areas can seem deceptively straightforward. In a sea of options, it’s important not only to understand what types of technology are available, but also to find the right tool for the specific goals or challenges an organization is trying to solve, while ensuring that the expertise of the selected lab partner fits both current and future needs.
Leverage the Right Data
Traditionally, microbiology testing in food has been reactive; results and data captured at a point in time are returned one to three days after sample collection. Organizations have tried to move to a proactive model of risk anticipation, seeking to apply big data methodology to food safety problems. As discussed above, big data requires significant inputs, and the finite “positive case” data in food safety is a hurdle to applying the method effectively; yet this is only one limitation of a big data approach to food safety.
In addition to requiring an abundance of data for successful implementation, this approach works best in structured, static environments. The challenge in food safety is that a food processing facility is not a simple, static environment. Processes are complex, with many moving parts, and a large number of ever-changing variables can affect food safety and quality, such as environmental issues, compromised raw materials, or contamination in the process. Big data algorithms will only return broad-stroke results that answer large-scale questions. There is no substitute for subject matter expertise and a diagnostic partner to ensure insights are tailored to each specific circumstance. Simply put, there is no big, off-the-shelf solution to ensure food safety and quality.
Unlock New Insights
While data can be collected across food processing, from raw materials to end-user consumption, positive testing results are the most important anchor points that drive actionable insights. Now, more than ever, molecular diagnostic tools, both new tools and familiar ones being applied to food safety, are giving organizations the ability to focus on specific areas and understand their “small data” for the first time. Two molecular diagnostic tools currently being leveraged to focus on this small data are whole genome sequencing (WGS) and metagenomics.
WGS: Well known across the diagnostic testing industry, WGS is gaining traction as more organizations leverage the tool to provide more in-depth information about specific contamination events. One barrier to wider adoption of this technology in the food safety sector is the depth of data the application returns, which can cause hesitancy because that data is potentially discoverable and may lead to unintended consequences for the organization.