Sustainable Food Safety Focuses on Risk Management

In an ideal world, every piece of food we eat would be free of pathogens at all times. In the real world, though, where 600 million people contract a foodborne illness every year, this just isn't the case. In fact, it's impossible: microbes are just too ubiquitous, and food systems are too complex, to eliminate them entirely.

For this reason, scientists like Martin Wiedmann, DVM, Ph.D., the Gellert Family Professor in Food Safety at the Cornell University College of Agriculture and Life Sciences, advocate moving away from trying to obliterate microbial risk altogether and, instead, moving toward using data-driven strategies that contextualize the risk to guide action. Doing so can help food regulators and companies prioritize where to put their resources, leading to food safety systems that are less Sisyphean and more sustainable.

"There is not unlimited money in food safety, nor is there anywhere else," Wiedmann said during a scientific session at ASM Microbe 2025. "So, we really have to be careful and use our science to help everyone involved, to invest money where there's the biggest impact and where we can drive continuous improvement."

Why Zero Microbial Risk Does Not Exist

Life is inherently risky. While we accept risk in some areas of our lives (e.g., driving a car), we are less accepting in other areas. Generally, food falls in the latter category. It'd be nice if food systems could achieve absolute safety, in which food is guaranteed to be 100% devoid of harmful agents. But there are several reasons why this isn't feasible.

For one, the world is jam-packed with microbes. They are in the environment our food comes from, the machinery used to process food and the people working with and around food. Some of these organisms aren't a problem, but some, like Listeria monocytogenes and Salmonella, can cause serious illness. With so many microbes everywhere all the time, eliminating any chance of them getting on or into food is impossible. "Transmission is complex," Wiedmann noted, and contamination can occur "all the way from primary production to the consumer."

There are methods to reduce the number of microbes in certain food products (e.g., pasteurization). But such treatments keep the concentration of microbes below a given threshold (i.e., a particular CFU/g of product); they do not remove them completely. Moreover, some foods, like yogurt and cheese, are what they are because of microbes. In such cases, aiming for a complete absence of microbes is not just unachievable, but undesirable.
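The threshold idea can be sketched with a quick calculation. Microbial kill steps are often described in terms of log reductions; the starting load, reduction and threshold below are hypothetical numbers for illustration:

```python
def log_reduction(initial_cfu_per_g: float, log10_reduction: float) -> float:
    """Concentration remaining after a treatment achieving a given log10 reduction."""
    return initial_cfu_per_g / 10 ** log10_reduction

# Hypothetical: a 5-log treatment applied to a product starting at
# 10,000 CFU/g leaves about 0.1 CFU/g -- far below a 100 CFU/g
# threshold, but not zero.
remaining = log_reduction(1e4, 5)
print(remaining)  # 0.1
```

The point is that even an aggressive treatment drives the concentration down by orders of magnitude rather than to a literal zero.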

We're also limited in our detection capabilities. Testing food for microbial growth is like peering through a straw and trying to paint a picture of the world; the samples represent a puny percentage of the billions of pounds of food produced and consumed. For example, one might sample a few granola bars out of a batch of thousands. If those are negative for, say, Salmonella, it doesn't mean that every single bar produced now, or in the past or future, is, was or will be free of the organism. "If I make a million packages a day, 1 negative test hardly equals zero risk," Wiedmann said. "But it gives us the illusion of zero risk."
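Wiedmann's point about a million packages and one negative test can be made concrete with a back-of-the-envelope calculation (the contamination rate and sample size here are hypothetical):

```python
def prob_all_negative(p: float, n: int) -> float:
    """Probability that n independently sampled units all test negative
    when a fraction p of units in the batch is actually contaminated."""
    return (1 - p) ** n

# Hypothetical: 0.1% of a day's million packages are contaminated,
# and 10 packages are tested.
p_miss = prob_all_negative(0.001, 10)
print(f"Chance of seeing only negatives: {p_miss:.1%}")  # ~99.0%
```

Even with 1,000 contaminated packages in the batch, an all-negative test result is overwhelmingly likely, which is exactly the "illusion of zero risk" Wiedmann describes.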

Embracing a Risk-Based Approach to Food Safety

Achieving zero risk may be unattainable, but the concept still shapes food safety through "zero-tolerance" policies, in which the mere presence of a hazard (pathogen) can prompt regulatory action. The U.S. takes this so-called "hazard-based" approach when it comes to L. monocytogenes in ready-to-eat foods (e.g., lunchmeat); any positive test for the bacterium leads to a recall of the entire batch of a product, no matter if 1 bacterial cell was detected in the sample or a million.

However, this blanket approach has its downsides, including the potential to increase food waste and cost and decrease food availability. It's a black-and-white strategy for a problem colored with infinite shades of gray. This is why risk-based approaches, which focus on preventing the most serious health risks by ranking hazards by their likelihood of occurring and their potential impact, are particularly attractive.

Promoted by scientists like Wiedmann and increasingly adopted by the food industry at large, risk-based food safety weighs all the factors that influence how problematic a pathogen is in a given food context (such as its prevalence, who is consuming the food and how the food will be eaten or prepared) to tailor interventions. The approach hinges on the recognition that not all risks in food, as in life, are the same.

An example: the presence of L. monocytogenes within a processing plant is a hazard. "However, the level of risk associated with that Listeria can differ substantially," Wiedmann said. If the organism lives on a surface covered in condensation that drips onto food, the risk associated with L. monocytogenes is high. But "if I have that same Listeria sitting at the loading dock, where I have multiple procedures in place to prevent it from even getting close to food, the level of risk associated with the hazard is extremely low." A hazard-based strategy means trying to rid the whole plant of L. monocytogenes. A risk-based strategy, instead, determines where the biggest risk is (i.e., surfaces that contact food) and focuses mitigation efforts on those areas to achieve more practical and sustainable solutions.

Risk Assessments: The Bread and Butter of Risk-Based Strategies

But how does one go about building a risk-based approach? Two words: risk assessments. Simply put, "we take various numbers, we put them together and assess [microbial] transmission in different environments to come up with a risk," Wiedmann said. Data about a food (how it's grown, transported, processed and more) are fed into mathematical models. The models estimate the likelihood of contamination and the severity of the consequences; they can also evaluate how effective different preventive tactics are at minimizing risk. Risk managers use the results to make decisions about controlling or reducing risk.

Risk assessments can help identify the foods that deserve particular attention. They may examine how much risk a food product poses to a population (i.e., the total number of potential cases of illness), as well as the risk per serving (i.e., how likely an individual is to get sick from eating the food). Factors like microbial growth also come into play. One bacterial cell in a food likely poses a small risk (depending on the pathogen). But if that cell multiplies into the millions? That's a big risk. A company may use risk assessments to determine that a certain number of bacteria are OK, so long as that doesn't grow to levels that lead to a higher risk of illness; their focus, then, is on preventing growth.
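The growth concern can be illustrated with a toy calculation. The starting count and doubling time below are hypothetical; real growth rates depend on the organism, the food matrix and the temperature:

```python
def cells_after(initial_cells: float, hours: float, doubling_time_h: float) -> float:
    """Cell count after a given time, assuming unchecked exponential doubling."""
    return initial_cells * 2 ** (hours / doubling_time_h)

# Hypothetical: a single cell doubling every 45 minutes under
# temperature-abuse conditions.
for h in (6, 12, 24):
    print(f"after {h:>2} h: ~{cells_after(1, h, 0.75):,.0f} cells")
```

One cell becomes hundreds within hours and billions within a day, which is why a company's focus may land on preventing growth rather than chasing the last cell.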

To that end, where to focus mitigation efforts is a key use case for risk assessments. Which subtypes of Salmonella should we vaccinate poultry against to reduce human illness? Should romaine lettuce farms focus more on wildlife intrusions or irrigation water to limit contamination by Escherichia coli O157:H7? (Analyses by Wiedmann's lab suggest the latter is a better bet). Does it make more sense to spend money on testing produce for contamination or reducing the concentration of microbes on a product?

Whatever the question, they all have 1 common variable: money. According to Wiedmann, funding allocation is a crucial factor to consider when encouraging the adoption of risk-based approaches. Food production is, after all, a business. "I propose that we assess risks associated with microbes, not just for public health, but also for business impacts," he shared. "It doesn't mean business impact is more important than public health, but you need to understand all of them to make better decisions."

Negotiating the Future of Food Safety

Food safety is ultimately about balancing trade-offs. Deciding on those trade-offs is tricky, though. Food systems are complicated, involving an extensive list of stakeholders spanning farms, factories, retailers and consumers, each with their own values, priorities and responsibilities. With that in mind, the future of food safety may depend not just on analyzing risks, but on collaboratively negotiating them.

"[In] your traditional risk analysis framework, you do a risk assessment and then a risk manager decides what to do. Risk negotiation involves the people affected from the very beginning and tries to get them to decide what is acceptable and what's not," Wiedmann explained. Engaging stakeholders from different sectors and disciplines-from farm to fork-ensures that diverse values and priorities shape practical, inclusive solutions. While this approach presents its own challenges, he noted that technological tools, such as AI, could support the process. Specifically, AI could parse through competing stakeholder needs and begin negotiations without involving people. The output would provide a blueprint for negotiation when people do enter the picture, giving participants a shared starting point.

As Wiedmann sees it, "models help us to assess what risks we have [and] to make better decisions, but then we need to engage in that part of 'how do we decide acceptable levels of risk?' And [I think] risk negotiation is a key pathway."


Research in this article was presented at ASM Microbe, the annual meeting of the American Society for Microbiology, held June 19-23, 2025, in Los Angeles.