Bacteriophage and Probiotics for Pathogen Control in Food Production Facilities
Introduction and background
In recent years the use of bacteriophage (viruses that specifically bind to, infect, and ultimately kill bacteria) and probiotics (mixtures of what are often called ‘good’ bacteria) to control the food pathogens Listeria monocytogenes, Salmonella spp., and Shiga toxin-producing E. coli (STEC) in food production facilities has become increasingly commonplace. The fear of foodborne disease outbreaks, and the devastating economic consequences for the companies associated with them, has sparked a “we are willing to try anything” mentality among food producers in every major industry sector. This has led to the implementation of some questionable practices without a full understanding of the potential for unintended consequences. The intent of this article is not to impugn the good intentions of the companies and people involved in promoting and using these newer approaches to pathogen control. They are (almost) all motivated by the same desire to do the right thing and protect the public health. That said, in their zeal to do something, some may have moved too quickly, and it is possible that at least two of these technologies, bacteriophage and probiotics (newer in this particular application, at least), have the potential to do more harm than good. Going further, the potential for unintended harm is so great that a moratorium on their further use may be warranted until we have a much better understanding of the consequences of adopting these technologies on such a wide scale. That will require the generation of far more data than currently exist, data that prove their safety and efficacy to the satisfaction of the food microbiology and greater scientific communities.
The problems with bacteriophage
The reasons for concern are different for each technology, and yet they also overlap. In both cases the problem lies in how each impacts the specific detection technologies currently used to monitor for the very pathogens they are applied to kill. For bacteriophage, the problem is twofold. First, these phage are propagated in the organism(s) they are designed to target and kill. Since they only infect those specific organisms, that makes a lot of sense, and it also happens to be the feature that makes them so desirable as antimicrobials. Once grown to a target level, the phage must be purified away from the host strain(s) so that they can be formulated into the various products that will be used to treat the production environment. The numbers of bacteria used and bacteriophage produced are huge, on the order of 10⁹ (one billion) or more. The bacteria used to raise the phage are all killed when the phage burst out of the host cells as part of their life cycle. It is this lytic property that is at the heart of the antibacterial activity of the phage, and what makes them such effective bacteria killers. After all the bacteria are dead, what is left is a “soup” of viable phage (still living, if you believe phage are alive, that is) with nowhere left to go, plus a mix of extracellular and intracellular bacterial components, including their DNA. The next step of the process is a purification designed to remove these remaining bacterial parts and deliver a solution of pure phage. Unfortunately, the purification processes are not 100% efficient, and some host bacterial DNA (quite a lot, it turns out; exactly how much is unknown and likely highly variable) makes it through and ends up as part of the “pure phage solution” that will eventually be sprayed all over the production plant. So what, you might be asking yourself. Who cares if a little bacterial DNA gets applied along with the phage?
The problem is that most modern food production facilities use molecular detection technologies such as PCR or real-time PCR as part of their routine pathogen monitoring programs. These techniques are based on the detection of pathogen-specific DNA. In applying the antimicrobial phage solution, producers have sprayed a large quantity of detectable pathogen DNA all over the plant environment: the walls, the floors, the equipment, anywhere and everywhere they think pathogens might possibly be found. The (in hindsight) predictable result is the generation of high numbers of false positives with the next set of environmental monitoring samples sent to the lab for analysis.
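To get a rough sense of scale, a back-of-the-envelope calculation is instructive. Every number below is an illustrative assumption, not a measured value from any commercial phage product; the point is only that even a small carryover fraction can leave DNA far above a typical qPCR limit of detection.

```python
# Back-of-the-envelope: residual host DNA in a phage preparation vs. a
# qPCR limit of detection. All values are hypothetical assumptions.

HOST_CELLS_LYSED = 1e9   # host genome copies released per mL of lysate (assumed)
DNA_CARRYOVER = 0.01     # fraction of genome copies surviving purification (assumed)
SPRAY_VOLUME_ML = 100    # volume of product applied to one sampled zone (assumed)
SAMPLE_FRACTION = 1e-4   # fraction of sprayed DNA reaching one PCR reaction (assumed)
QPCR_LOD_COPIES = 10     # genome copies detectable per reaction (typical order)

genome_copies_sprayed = HOST_CELLS_LYSED * DNA_CARRYOVER * SPRAY_VOLUME_ML
copies_in_reaction = genome_copies_sprayed * SAMPLE_FRACTION

print(f"genome copies applied to the zone: {genome_copies_sprayed:.0e}")
print(f"copies reaching one PCR reaction:  {copies_in_reaction:.0e}")
print("false positive likely" if copies_in_reaction > QPCR_LOD_COPIES
      else "below limit of detection")
```

Even if the carryover fraction or sampling efficiency here is off by several orders of magnitude, the margin above the limit of detection is large enough that false positives remain plausible.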
False positives are a pain and no fun for anyone involved, but they are not necessarily dangerous, except perhaps to the testing lab that must report the results to clients, who often react in anger or immediately point the finger at the lab with accusations of laboratory “cross contamination” or “sloppy technique.” False negative results, on the other hand, could very well be deadly. This brings me to the second problem with the use of bacteriophage as antimicrobials in the food production plant environment: their use or misuse could generate false negative results in certain situations. Ultimately, this is the reason why a moratorium on the practice may be needed. To see how this might happen, one needs to understand that viruses (human or bacterial) do not do very well in dry conditions. In the modern food production facility there are many dry processes, dry areas, and dry products. In fact, it is a goal of modern food production to keep the environment as dry as possible, and this is to the benefit and greater safety of our food supply. The more water that can be kept out, the lower the overall water activity (or aw), and the harder it is for most microorganisms, including food pathogens, to grow. Because of this, bacteriophage are in many instances being used to treat processes, or parts of processes, that have very low water activity (that is, are dry). One could question the efficacy of bacteriophage treatments on these grounds alone: a phage simply will not infect its host target if conditions are too dry. However, the intent of this piece is not to question the efficacy of the treatment but rather to illustrate the potential danger. At this point you still may not see where the problem is from a food safety perspective. Maybe these antimicrobials are not quite as effective in the real world as the lab data would suggest, but what’s the harm in that?
To complete the picture and see the danger requires understanding one more important fact about all currently used food pathogen detection methods: each requires a pre-enrichment of the test sample prior to the detection step. The reasons for this are many, and it would distract from the message to belabor them here. Suffice it to say that enrichment is a requirement and that, in addition to providing ideal conditions for any bacterial pathogens to grow and multiply, it provides a very nice breeding ground for any viable phage that find their way into it. In environmental monitoring for food pathogens, the sample is most typically collected with a small 4" x 2" cellulose sponge, which is wiped across the surface to be sampled, typically a 2' x 2' or larger area. Any bacteria are picked up by the sponge, which is then transferred into an enrichment medium designed to grow up (amplify) the target pathogen to levels detectable by whatever detection system is being employed. The enrichment medium is a liquid, and not only will bacteria find it an attractive place to grow and multiply, any phage present will as well. And therein lies the problem: in order for the pathogens to grow and multiply they must be alive and viable, and if phage are also present they will quickly infect and kill any that are. Thus the pathogens will never reach detectable levels even though they were there, possibly in numbers high enough that, had they found their way into a given food, they could have caused illness or death. If they do not reach detectable levels, the detection system will report a negative result and the plant will have no indication there is a possible problem. Of course, the plant production people and the sellers of the bacteriophage will say no, the phage killed the bacteria before they got into the enrichment, and the truth is they may be right. Currently there is simply no way to know for sure. Microbiomics or other sequencing-based approaches might suggest the presence of pathogen DNA in a sample, which could be suggestive, but the sensitivities of these approaches are also limited, and a negative result would only mean there was not enough DNA to be detected. It would still not establish that no viable pathogens were present in the original sample.
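The dynamics described above can be sketched with a toy hour-by-hour model of an enrichment broth, with and without carried-over lytic phage. Every parameter (doubling time, burst size, adsorption rate, detection threshold) is an illustrative assumption, and the model deliberately simplifies real phage kinetics (for instance, phage are never consumed on adsorption), but it captures the qualitative failure mode: the phage population amplifies alongside its host until it collapses the culture before detection.

```python
# Toy model: pathogen growth in an enrichment broth, with and without
# carried-over lytic phage. All parameters are hypothetical assumptions.

def enrich(pathogen0, phage0, hours=24, growth=2.0, burst=50, adsorb=1e-7):
    """Return viable pathogen count after enrichment. 'adsorb * phage' is
    the per-hour fraction of cells infected (crude mass-action sketch;
    phage are not consumed on adsorption here, a simplification)."""
    cells, phage = float(pathogen0), float(phage0)
    for _ in range(hours):
        infected = min(cells, adsorb * cells * phage)  # cells lost to lysis
        cells = (cells - infected) * growth            # survivors keep doubling
        phage += infected * burst                      # each lysed cell releases a burst
    return cells

DETECTION_THRESHOLD = 1e4  # cells needed for a PCR-positive enrichment (assumed)

clean = enrich(pathogen0=10, phage0=0)     # sponge picked up pathogens only
spiked = enrich(pathogen0=10, phage0=1e6)  # sponge also picked up sprayed phage

for label, n in (("no phage", clean), ("with phage", spiked)):
    status = "DETECTED" if n > DETECTION_THRESHOLD else "negative"
    print(f"{label}: {n:.3g} viable cells -> {status}")
```

In this sketch the phage-free enrichment sails past the detection threshold, while the phage-contaminated one crashes to zero viable cells, reporting negative even though ten viable pathogens were in the original sample.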
The problem with probiotics
A newer alternative to bacteriophage for the control of pathogens in food production environments is the use of probiotics. Depending on whom you ask, probiotics work through one of two major mechanisms: competitive exclusion and/or active inhibition. The competitive exclusion hypothesis says that by seeding the environment with “good” bacteria, the “bad” bacteria will be outcompeted for nutrients and/or habitat and thus will never have the chance to establish harborage sites and become persistent inhabitants of the production facility. The second hypothesis reminds us that many microorganisms produce antibiotics and are actively antimicrobial toward other microbes. In fact, an antibiotic was originally defined as a substance produced by one microorganism that selectively inhibits the growth of another. Today, in the age of synthetic antibiotics, that convention no longer holds, but the idea that microbes can be used to kill other microbes is grounded in fact and well studied. The mechanism of action is not relevant to this discussion; irrespective of how they are supposed to work, or even how well they work (mostly unknown), probiotics are being used today in food production plants across the United States (just how widely is unknown) in the hopes that they will prevent contamination of our food with human pathogens.
Much as with bacteriophage, the problem with probiotics lies not in the technology itself, but in how it impacts another technology: the detection methods used to monitor for the presence of pathogens in modern food production facilities. Just like bacteriophage, probiotic bacteria will find their way into the enrichments used to grow these pathogens to detectable levels, and just like bacteriophage they will interfere with the ability of those enrichments to do what they were designed to do. In the case of probiotics this might happen through active inhibition, as previously described, and/or through depletion of the nutrients the pathogens need to grow (a form of competitive exclusion), and/or through interference with the classical cultural techniques used to confirm the results of widely used molecular detection methods like PCR. The need for and importance of culture confirmation is a controversial and complicated topic that would take another long-form article to do it justice. It is enough to say that today such confirmation is a requirement of all molecular diagnostic techniques for food pathogens. One weakness of these cultural methods is the difficulty presented when high concentrations of non-target microflora are present in the original sample. The methods can quickly be overwhelmed by the background flora, and finding the target pathogen can become difficult or impossible. No matter the mechanism, the end result is the same: a negative result by the pathogen screening method or cultural confirmation, even though pathogens were present in the original sample prior to enrichment.
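The nutrient-depletion version of this interference can be sketched with a second toy model: two populations doubling in the same broth while drawing on one finite nutrient pool. As before, every parameter is an illustrative assumption; real enrichment media and competition dynamics are far more complex.

```python
# Toy model: a pathogen competing with a heavy probiotic background for
# one shared, finite nutrient pool in an enrichment broth.
# All parameters are hypothetical assumptions.

def coenrich(pathogen0, probiotic0, nutrients=1e9, hours=24, growth=2.0):
    """Return viable pathogen count after enrichment. Each new cell
    consumes one nutrient unit; when the pool cannot cover the demand,
    both populations grow proportionally less (then stop)."""
    path, pro = float(pathogen0), float(probiotic0)
    for _ in range(hours):
        new_path = path * (growth - 1)
        new_pro = pro * (growth - 1)
        demand = new_path + new_pro
        if demand > nutrients:            # pool exhausted this hour
            scale = nutrients / demand    # share what little remains
            new_path *= scale
            new_pro *= scale
        nutrients -= new_path + new_pro
        path += new_path
        pro += new_pro
    return path

DETECTION_THRESHOLD = 1e4  # cells for a PCR-positive enrichment (assumed)

alone = coenrich(pathogen0=10, probiotic0=0)
crowded = coenrich(pathogen0=10, probiotic0=1e7)

for label, n in (("pathogen alone", alone), ("with probiotic load", crowded)):
    status = "DETECTED" if n > DETECTION_THRESHOLD else "negative"
    print(f"{label}: {n:.3g} viable cells -> {status}")
```

In this sketch the pathogen alone reaches easily detectable levels, while the heavy probiotic background exhausts the nutrient pool within a few hours and strands the pathogen well below the detection threshold, even though viable pathogens were in the original sample.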
It is for all of these reasons that a moratorium on the use of these two technologies in food production facilities may be warranted until additional data can be generated demonstrating that interference with the widely used pathogen detection methods is not a problem. Until that happens, it may only be a matter of time before a false negative result caused by bacteriophage or probiotics triggers a chain of events that leads to an outbreak. Of course, it would be difficult, maybe even impossible, to link a given outbreak to the use of one of these technologies, and it is also possible that there is no real risk. However, with so many unanswered questions, and so much (at least hypothetical) cause for concern, it would seem unwise to ignore the warning signs and continue on our present course without seriously considering the possibility of unintended negative consequences.