3. Exposure assessment
Exposure is the product of intake and biological availability. As
a conservative (worst case) scenario, biological availability is frequently
assumed to be complete. This is a pragmatic approach as
the true human biological availability of a chemical contaminant
may be very difficult to ascertain and the experiments required
might be unethical. The intake component equates to the amount
of chemical in the aggregated consumption of foods (including
beverages and water). This is normally calculated from the concentration
measured in food (or beverage) and a statistical evaluation
of the consumption of food by the general population. Numerous
dietary surveys have been conducted in member states, including
the UK [11,12] and the Netherlands [13]. The surveys provide representative
data on the distribution of consumption of different foods
by age and socio-economic groups. The data are used to estimate
food intake (and thus chemical intake) for consumers. The availability
of adequate data on dietary intake (including monitoring
of trends in consumption over many years and across regional,
age and other group identities) means that, in certain cases, the
factor that most greatly influences variability in exposure assessment
is analytical measurement – i.e. analytical uncertainty can
significantly affect exposure estimates. This uncertainty has several
components, including measurement inaccuracy and, sometimes
more significantly, sampling error. It is sometimes very difficult to
sample in a way that is fully representative. For example, representative
sampling of the diet of a human population should reflect not
only national dietary norms but also ethnic, religious, regional, age
and socio-economic factors in dietary choice. Few dietary surveys
for food chemical residues consider such factors. The costs of high-end,
confirmatory analyses limit the number of tests conducted,
resulting in the submission of composite samples. Such “average”
samples may obscure the true range of concentrations at which
contaminants are found. If analytical capacity can be expanded
by developing lower cost, high-throughput methods, a truer estimate
of exposure – and perhaps improved focus on the exposure
of sub-populations – would be possible.
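The deterministic calculation described above (concentration multiplied by consumption, with bioavailability conservatively set to 1) can be sketched as follows. This is an illustrative example only: all food names, concentrations and consumption figures are invented for demonstration and do not come from the surveys cited in the text.

```python
# Sketch of a worst-case dietary exposure estimate, assuming complete
# bioavailability (factor = 1.0). All numbers below are hypothetical.

def dietary_exposure(conc_mg_per_kg, intake_kg_per_day, bioavailability=1.0):
    """Exposure (mg/day) = concentration x consumption x bioavailability."""
    return conc_mg_per_kg * intake_kg_per_day * bioavailability

# Aggregate over several foods containing the same contaminant.
foods = {                    # food: (concentration mg/kg, consumption kg/day)
    "cereal":         (0.05,  0.20),
    "fish":           (0.30,  0.04),
    "drinking_water": (0.002, 1.5),
}
total = sum(dietary_exposure(c, q) for c, q in foods.values())
print(f"Estimated exposure: {total:.4f} mg/day")   # 0.0250 mg/day

# A composite ("average") sample can mask the true concentration range:
individual = [0.01, 0.02, 0.03, 0.45]              # one hot-spot sample
composite = sum(individual) / len(individual)
print(composite, max(individual))                  # 0.1275 vs 0.45
```

The last two lines illustrate the point made about composite samples: pooling obscures the single high-concentration sample that a per-sample analysis would have flagged.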
For certain classes of chemical that may be present in food, regulatory
limits have not been established. A common reason is that the
chemical is not expected to be present in food at a level of concern
or has not previously been encountered in food at an appreciable
level. Consequently, the toxicological information required to
decide on an acceptable level of dietary exposure may not exist.
However, the purpose of producing exposure data is to assess the
related risk, so some form of risk assessment paradigm is needed
to place the data in context. In such cases, a systematic process can
be applied to conservatively assess the toxicological hazard associated
with poorly characterised chemicals [14]. This information
can be applied to derive a Threshold of Toxicological Concern for
substances present at low levels in the diet [15]. If analyses are
conducted for new or unexpected residues in the food chain, such
approaches may be valuable in the interpretation of data where no
other risk assessment tool is currently available.
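The TTC approach reduces, in practice, to comparing an estimated dietary exposure against a conservative threshold assigned from the chemical's structural class. A minimal sketch is given below; the threshold values are the widely cited Cramer-class figures in micrograms per person per day, but they should be checked against current guidance [14,15] before any real use, and the function name is our own.

```python
# Hedged sketch of a Threshold of Toxicological Concern (TTC) screen.
# Threshold values (ug/person/day) follow the commonly cited
# Cramer-class scheme; verify against current regulatory guidance.

TTC_UG_PER_DAY = {
    "cramer_I":        1800,   # low structural concern
    "cramer_II":        540,   # intermediate concern
    "cramer_III":        90,   # significant structural alerts
    "genotoxic_alert": 0.15,   # alert for potential genotoxicity
}

def exceeds_ttc(exposure_ug_per_day, cramer_class):
    """True if the estimated dietary exposure exceeds the class TTC."""
    return exposure_ug_per_day > TTC_UG_PER_DAY[cramer_class]

print(exceeds_ttc(50, "cramer_III"))    # below 90 ug/day
print(exceeds_ttc(120, "cramer_III"))   # above 90 ug/day
```

A result of `True` would not by itself establish risk; as the text notes, it simply flags a poorly characterised substance for further, more rigorous assessment.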
For unexpected or previously unknown chemicals, such analyses
will probably be at the screening level, not least because the
methodology used will be at an early stage of development. Screening
methods may provide the first evidence of a potential problem
with regard to exposure. The requirements of screening techniques
can be very different from those of confirmatory analysis.
Screening methods require sensitivity but can exhibit reduced
selectivity, meaning that they may produce false positive results.
Of course, this is preferable to the generation of false negative
results if the intention is to identify all possibly affected samples for
further investigation. Although some screens utilise direct instrumental
measurement, many are indirect and explore molecular or