What is Detection?
In the modern food microbiology testing laboratory, the word detection and the term detection method have very specific meanings. Both refer to the techniques used to determine the presence or absence of a particular microorganism or group of microorganisms (typically a pathogen or indicator organisms, and almost always bacteria) in a given food sample. These presence/absence methods are referred to as qualitative detection assays. Quantitative assays are also used and are sometimes referred to as detection methods as well. These methods determine the presence or absence of a target microorganism and also output the amount of the target microbe(s) present per unit volume or weight. Only rarely are both a qualitative and a quantitative method for the same target used to test a given sample.
All current pathogen methods require a pre-enrichment of the sample to raise the concentration of the target organisms to detectable levels and, sometimes, to dilute out food-matrix-specific assay inhibitors. Only living (viable) microbes will grow, so qualitative pathogen detection assays by their very nature detect only viable bacteria. Quantitative assays, in contrast, do not use pre-enrichment. Because of natural matrix and organism variability it is currently impossible to back-calculate a starting concentration from an enriched sample, so direct plating methods are most often used. These too will detect and quantify only viable bacteria, because only living microbes will grow and multiply on the plating media used.
Some examples of qualitative detection methods include end-point PCR, real-time PCR, and traditional culture methods for pathogens such as Salmonella, E. coli O157:H7 and other STEC, Listeria monocytogenes, and the genus Listeria. Examples of quantitative detection methods include Petrifilm plating, standard agar plating methods, and most probable number (MPN). Target organisms include Enterobacteriaceae, generic E. coli, coliforms, Staphylococcus aureus, and many others. All of these methods share a common element: the result is expressed as a positive/negative or as a concentration.
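The MPN figure mentioned above is itself a statistical estimate: given how many tubes at each dilution showed growth, one solves for the concentration most likely to have produced that pattern. Here is a minimal sketch, not a validated implementation; the function name and the brute-force grid search are mine, and in practice labs read the value from published tables (such as the FDA BAM MPN appendix), which are derived from the same likelihood.

```python
import math

def mpn_estimate(volumes, tubes, positives):
    """Maximum-likelihood MPN estimate (organisms per unit amount).

    volumes   -- sample amount per tube at each dilution (e.g. grams)
    tubes     -- number of tubes inoculated at each dilution
    positives -- number of tubes showing growth at each dilution
    """
    def log_likelihood(lam):
        ll = 0.0
        for v, n, p in zip(volumes, tubes, positives):
            if p > 0:
                # probability a tube with amount v is positive: 1 - e^(-lam*v)
                ll += p * math.log(1.0 - math.exp(-lam * v))
            ll += -(n - p) * lam * v  # negative tubes: e^(-lam*v) each
        return ll

    # grid search over a wide concentration range (0.01 to ~10,000)
    return max((10 ** (k / 300.0) for k in range(-600, 1200)),
               key=log_likelihood)

# classic 3-tube series at 0.1, 0.01, and 0.001 g per tube, pattern 3-1-0
print(mpn_estimate([0.1, 0.01, 0.001], [3, 3, 3], [3, 1, 0]))
```

For the 3-1-0 pattern shown, this lands at roughly 43 organisms per gram, in line with the standard table value for that combination.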
Roughly speaking, detection methods can be broken into two groups: molecular methods and traditional culture methods. Figure 1 is a high-level breakdown of the most common of these methods, categorized by type and detection mode. As you will note, compared to the number of detection methods available, very few options are actually used in food testing, at least on a regular basis.
Which brings us to the more difficult part of the discussion: what about the (mostly) newer methods such as sequencing (Sanger and next-generation sequencing/NGS), spectrometric methods (MALDI-TOF, Raman), and the various biosensors that use optical, acoustic, and/or electrochemical signal generation and detection modalities? Are these detection methods or something else? I think if you asked an everyday microbiologist working in a food lab, with only passing knowledge of these techniques, they would say yes for biosensor methods, but only because the term has the word sensor (which implies detection) built in, and no or I don't know for the others. I myself, a highly educated, Ph.D.-level research microbiologist with many years of experience in food testing and research labs of all types, still do not consider sequencing a detection method, though a quick search of the literature reveals that papers are being published at a rapid clip in which sequencing is used (or at least referred to) as such. In fact, I just returned from a food microbiology conference and attended a session in which 3 of the 5 speakers called various sequencing approaches detection methods throughout their presentations. I was cringing for much of the time, but why? Who is right here, and does it even matter?
Specifically, I want to focus on the sequencing methods. Is DNA sequencing a detection method, and does it matter whether or not it is referred to as such? Classically, sequencing has been considered a characterization method. In food and clinical microbiology one of its most popular uses is the identification of unknown bacteria: sequence an unknown culture isolate, compare the sequence to a database of known strains, and output the name of the unknown based on sequence similarity. Identification and characterization are not detection methods, or at least until now they have not been thought of or referred to as such, so what has changed? Why the sudden push to “force” these powerful newer methods into a box they had previously been left out of?
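The identification workflow just described (sequence, compare, name by similarity) can be sketched in a few lines. This is a toy illustration only: the reference "database" and sequence fragments below are invented, and real pipelines use proper alignment tools (e.g. BLAST against curated databases) rather than a generic string similarity.

```python
from difflib import SequenceMatcher

# toy reference "database": species name -> invented marker-gene fragment
REFS = {
    "Salmonella enterica":    "AGCTTGCAGGTACCGGATTAACG",
    "Listeria monocytogenes": "AGCTAGCAGCTACCGGCTTAACG",
    "Escherichia coli":       "AGCTTGCTGGTACCAGATTAACG",
}

def identify(unknown):
    """Return (best-matching species, similarity score) for a fragment."""
    scored = {name: SequenceMatcher(None, unknown, ref).ratio()
              for name, ref in REFS.items()}
    best = max(scored, key=scored.get)
    return best, scored[best]

name, sim = identify("AGCTTGCAGGTACCGGATTAACG")
print(name, round(sim, 2))  # exact match to the Salmonella fragment -> 1.0
```

Note what this workflow presupposes: you already have a pure isolate in hand. The method characterizes something that was detected by other means; it does not, by itself, detect anything in a food sample.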
One of the major drivers is the hope of gaining wider acceptance for culture-free detection/characterization approaches for food pathogen testing. It is believed by some that enrichments and culture-based methods are going the way of the dinosaur, and that the sooner they are phased out the better. While I certainly have sympathy for this view, and I even believe and support it where culture-based methods are used as detection systems in and of themselves, I think it is naive and maybe even dangerous to think that an enrichment-free method for the detection of the zero-tolerance pathogens is possible now or will be anytime in the near future. No form of sequencing, or any other method I have seen, is capable (or even close to capable) of pulling a single bacterium directly out of 375 g or more of solid food matrix and reliably detecting/characterizing it. Physics simply will not allow it: diffusion-rate limitations, and the current requirement that all existing detection/characterization platforms accept liquid-phase samples only, preclude it.
Even assuming one could reliably ensure that the single bacterium was released from the solid matrix into a liquid phase (all current detection methods require a liquid-phase medium, at least at the beginning of the detection process), the volume required to ensure every possible surface was exposed to said liquid would be immense. I have seen a number of physico-chemical separation approaches tried, and have even attempted some myself, such as liquid-phase extraction and other techniques borrowed from organic chemistry applied to this food-matrix extraction problem, but none have come close to the performance required for reliable extraction of a single bacterium from 375 g of solid food matrix. This does not even take into consideration the significant cost, throughput, and ease-of-use constraints that food testing labs are under. Even for food types that presumably should be much easier to handle, such as clear liquids where filtration approaches can be used, there are other complicating factors that still make this a very difficult problem.
Now suppose this phase-separation problem is solved and we have extracted the target into a small volume of liquid. I will be extremely generous and allow that someone has managed to extract a single target out of a 375 g chunk of ground beef into a 1 mL volume. Sounds great, but there is still a major problem: almost every modern characterization and/or detection platform operates on the microliter-to-nanoliter scale. That means another 1,000- to 1,000,000-fold concentration is still needed to get the target into an analyzable range volume-wise. Even then it may not be enough, depending on the final volume the detection system actually interrogates (and how it does so, e.g. mixing, static, flow-through, etc.) versus what it will accept in terms of loading.
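The arithmetic behind that 1,000- to 1,000,000-fold figure is worth spelling out. A quick back-of-the-envelope sketch (the aliquot volumes are illustrative assumptions, not any particular platform's specification):

```python
# 1 cell extracted into a 1 mL volume; how much further concentration
# does a micro/nanoliter-scale platform need before it can see it?
extract_ul = 1_000.0  # 1 mL expressed in microliters

for loaded_ul, label in [(1.0, "1 uL (microliter-scale)"),
                         (0.001, "1 nL (nanoliter-scale)")]:
    factor = extract_ul / loaded_ul
    # with 1 cell in 1 mL, the chance a single aliquot contains the cell
    p_hit = loaded_ul / extract_ul
    print(f"{label}: {factor:,.0f}-fold concentration needed, "
          f"or a {p_hit:.4%} chance per aliquot of catching the cell")
```

Put another way: without that concentration step, a nanoliter-scale instrument sampling a 1 mL extract would, on average, have to interrogate on the order of a million aliquots before encountering the one target cell.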
The bottom line is that until this confounding sample-preparation problem is solved, or the zero-tolerance requirement is significantly relaxed, there will never be a culture-free detection system. When I started my Ph.D. work in this area in 1996, this (sample prep) was the number-one problem facing the world of food pathogen detection; as I write these words in 2017, sadly, very little has changed on that front.