FL: What about known-known substances?
HM: Yes, these terms are often used and may lead to some confusion.
The known-knowns are compounds expected to be in
the sample, frequently found in routine controls; the usual suspects,
let's say. If that's all you're interested in, you would typically
measure these with targeted methods, with full analytical quality
control procedures for direct quantitative determination.
Known unknowns are substances that are not expected, but we
know they exist, so they might be present. For these you would
use library-based non-target screening methods. Unknown unknowns
are pesticides nobody knows exist, so they are not included in
any library or database. Detecting these is possible using
non-target measurement but requires different data-processing
strategies and is much more challenging.
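As a rough illustration of this distinction, here is a minimal suspect-screening sketch in Python. The compound names, masses, adduct assumption, and tolerance are illustrative choices, not details from the interview: a feature that matches a library mass is a "known unknown"; one that matches nothing stays unannotated, like an "unknown unknown" would.

```python
# Toy suspect screening: match measured accurate masses against a
# compound library. All values below are illustrative assumptions.

PROTON = 1.007276  # proton mass, assuming [M+H]+ ionisation

# "Known unknowns": not targeted, but present in a library/database.
suspect_library = {
    "fipronil": 435.9387,   # approx. monoisotopic mass
    "atrazine": 215.0938,
}

def screen_features(features, library, tol_ppm=5.0):
    """Annotate accurate-mass features against a suspect library.

    Features matching a library mass (within tol_ppm) are known
    unknowns; the rest remain unannotated and would need true
    non-target strategies (e.g. formula generation) to identify.
    """
    hits, unannotated = {}, []
    for mz in features:
        neutral = mz - PROTON  # back-calculate the neutral mass
        for name, mass in library.items():
            if abs(neutral - mass) / mass * 1e6 <= tol_ppm:
                hits[name] = mz
                break
        else:
            unannotated.append(mz)  # candidate "unknown unknown"
    return hits, unannotated

hits, unknown = screen_features([436.9460, 123.4567], suspect_library)
# hits -> {"fipronil": 436.9460}; unknown -> [123.4567]
```

Real non-target workflows add retention time, isotope patterns, and fragment spectra on top of accurate mass, but the matching logic follows this shape.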
FL: Is that the situation where the different methodologies
show their strengths and weaknesses?
HM: Targeted methods have restrictions with respect to the
number of compounds that can be simultaneously analysed;
their strength is that they offer the lowest detection limits.
Non-target methods are often a bit less sensitive; their strength
is that they are more straightforward in terms of instrument
operation, offer the widest scope, and, last but not least, allow
retrospective data analysis. If at some point something unexpected
comes up, it is helpful to be able to detect the substance. Fipronil
is a good example: it is a known compound but was not expected
in eggs. If laboratories had analyzed the eggs with a non-targeted
approach, they could have gone back to the raw data and easily
detected fipronil, and also established when the illegal use in
poultry started.
FL: Would you then recommend monitoring all foods with non-target approaches?
HM: In principle I would recommend measuring all food
using non-target approaches, provided the required detection
limits can be achieved. For pesticides this is nowadays possible.
With current full scan HRMS instruments, you can obtain
good quantitative data. So basically you could transfer
the multi-residue method normally done on a target MS/MS
instrument to a non-target full scan HRMS instrument. For
quantitative analysis of the 200 frequently found pesticides,
you do your calibration, data handling and quality control as
normal. Then, using the same raw data, you can do a second
data processing procedure to screen the samples for 500 or
more additional pesticides or other contaminants. This way,
you have the additional screening as an option and, if used, at
low additional cost. I foresee a gradual replacement of triple
quad MS/MS instruments by full scan HRMS instruments.
Gradual because it would be too costly to replace all existing
and still well-functioning MS/MS instruments at once with full
scan HRMS instruments. In addition, it takes time to implement
new technologies and workflows, especially in an accredited
routine laboratory environment.
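The two-pass idea HM describes, quantifying the calibrated targets first and then reusing the same raw data for suspect screening, can be sketched as follows. This is a toy model: the feature table, response factor, suspect list, and units are invented for illustration.

```python
# Two passes over the SAME raw HRMS feature data:
# pass 1 quantifies calibrated target pesticides, pass 2 screens
# for additional suspects without calibration. All numbers are
# made-up illustrations (masses are approximate monoisotopic values).

raw_features = {435.9387: 8.0e5, 215.0938: 2.5e5, 303.0540: 1.2e5}  # mass -> peak area

# Pass 1: targeted quantitation with full calibration/QC.
calibrated_targets = {"fipronil": (435.9387, 2.0e-6)}  # mass, response factor

def quantify(features, targets, tol=0.005):
    """Convert peak areas to concentrations for calibrated targets."""
    results = {}
    for name, (mass, factor) in targets.items():
        for mz, area in features.items():
            if abs(mz - mass) <= tol:
                results[name] = area * factor  # toy mg/kg
    return results

# Pass 2: suspect screening of the same data, detection only.
suspect_list = {"atrazine": 215.0938, "imidacloprid": 255.0523}

def screen(features, suspects, tol=0.005):
    """Flag suspects whose mass matches any measured feature."""
    return [name for name, mass in suspects.items()
            if any(abs(mz - mass) <= tol for mz in features)]

quant = quantify(raw_features, calibrated_targets)  # fipronil -> ~1.6 (toy units)
found = screen(raw_features, suspect_list)          # ['atrazine']
```

The point of the design is that pass 2 costs no extra instrument time: it is only an additional processing step over data already acquired, which is what makes retrospective analysis possible.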
FL: How about the real versus the perceived risk? Authorities
like BfR are telling us there is no risk.
HM: Well, in case of pesticides, if the maximum residue limit is
exceeded, this does not necessarily mean there is a health risk.
The limit is not only based on toxicity, but also on Good Agricultural
Practice (GAP). According to GAP, the amount of pesticide
allowed to be applied on the crop is based on what is needed
to manage the pest. If that leaves only a low residue that is not
toxicologically relevant, the limit will still be based on that low
residue. But this is hard for consumers to understand, because
for them exceeding a limit automatically means risk. While this
might be the case for contaminants such as mycotoxins or dioxins,
this is mostly not the case for pesticide residues. A potential
problem lies more in the exposure to multiple pesticides at the
same time. Here a knowledge gap exists and more clarity is
needed on potential synergistic effects.
FL: With the RAFA (Recent Advances in Food Analysis) conference
in November in Prague on the horizon: what do you expect
instrumental chemical analysis to come up with in the future?
What is the next level? Will limits be reduced further?
HM: Instrumentation is continuously improving in terms of
sensitivity and selectivity, user friendliness, and data processing
software, including cloud-based libraries. All this is obviously
advantageous for food safety analyses. The improved
sensitivity may lead to more detections, but will not automatically
lead to lower maximum limits just because we can. Besides
improvement of existing techniques, new options to further
increase separation power are becoming available, such as
multidimensional (2D) chromatography and ion mobility
spectrometry, combined with (high resolution) mass
spectrometry. Whether these more advanced options are
needed depends on the application.
FL: The time consumed by sample prep is not a big deal, but
comparing raw data against the databases used in non-target
approaches can take a week. What is your expectation?
HM: Data processing in non-target approaches was very
time consuming in the past, but nowadays the software is more
powerful and fit-for-purpose. There is still room for improvement,
but it is fit for use.
Regarding sample prep, yes for the generic multi-residue
methods this is straightforward and not a big deal.
It is good to mention that multi-residue methods, target
or non-target, cannot solve everything. There are pesticides
such as glyphosate that have very different physicochemical
properties. They do not fit into generic QuEChERS-type
methods; they need a dedicated extraction, a special
LC column, etc.
FL: It seems that, due to increasing consumer and politician
awareness, there might be a trend towards more
screening-oriented methods?
HM: Well, with increasing attention to the circular economy,
we may recycle everything along the food/feed chain. So it is
wise to keep our eyes wide open for things we are perhaps
not expecting. Things may re-enter the food chain.
FL: Thank you.