MLLE was developed for epidemiological studies and for human and ecological risk assessments [68,69]. MLLE, as adapted by the US Natural Resource Management (NRM) [70], is mainly used to explore cause-effect relationships [71].
A BBN is a probabilistic model that represents a set of random variables and their conditional dependencies via a directed acyclic graph (DAG).
For example, a BBN could represent the probabilistic relationships between diseases and symptoms. Given symptoms, the network can be used to compute the probabilities of various diseases.
BBN also provides a method for integrating the best available data from a variety of sources [62,66].
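The disease-symptom example above can be sketched as a minimal two-node network. The sketch below uses only Bayes' rule on a single disease-symptom pair; all probability values are hypothetical and chosen purely for illustration.

```python
# Minimal sketch of the disease-symptom BBN example above.
# All probabilities are hypothetical, for demonstration only.

p_disease = 0.01                   # P(D): prior probability of the disease
p_symptom_given_disease = 0.90     # P(S | D)
p_symptom_given_no_disease = 0.05  # P(S | not D)

def posterior_disease_given_symptom():
    """Compute P(D | S) by Bayes' rule: P(D|S) = P(S|D)P(D) / P(S)."""
    p_symptom = (p_symptom_given_disease * p_disease
                 + p_symptom_given_no_disease * (1 - p_disease))
    return p_symptom_given_disease * p_disease / p_symptom

print(round(posterior_disease_given_symptom(), 3))
```

Even with a 90% true-positive rate, the low prior keeps the posterior modest, which is exactly the kind of inference a full BBN performs across many linked variables.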
Furthermore, as in any monitoring programme, efforts are needed to reduce uncertainties during data collection and data analysis.
There are many sources of uncertainty in an IEHM programme which can generally be divided into two groups, quantitative uncertainty and qualitative uncertainty [33].
Quantitative uncertainty may derive from a lack of precision (e.g., variation in measurement due to insufficient number of observations, random sampling error) or a lack of accuracy (e.g., inaccuracies in observations, deriving from structural measurement errors, inappropriate extrapolation, confounding, etc.).
Qualitative uncertainty refers to things that we do not know and that cannot be captured in a statistical sense, e.g., differences of opinion between scientists, differences in the framing of a problem, or inconsistencies in the scientific knowledge base [33].
Linking this to the structural work process of an IEHM programme, quantitative uncertainty would most likely be restricted to steps 2 to 5 above, while qualitative uncertainties can arise at all steps.
All types of uncertainty require handling with appropriate techniques, and a broad set of tools exists to do so. In summary, three types of techniques can be used for analysing uncertainties in IEHM programmes: (i) Data quality assessment: methods for data quality assessment (e.g., the Aguila tool [72], the Numerical, Unit, Spread, Assessment and Pedigree (NUSAP) system [73,74]) can be used to deal with quantitative uncertainty by evaluating whether data are fit for purpose.
Such assessment involves “the scientific and statistical evaluation of data to determine whether they meet the objectives of the project, and thus are of the right type, quality, and quantity to support their intended use” [75]; (ii) Expert elicitation: this approach can be used to deal with qualitative uncertainty by consulting experts to derive preliminary estimates for quantities about which scientific knowledge is as yet incomplete or inconsistent [33,75,76]; (iii) Methods based upon a typology of uncertainty: a typology can help to structure the different types of uncertainty (e.g., contextual uncertainty, model structure uncertainty, parameter uncertainty, input data uncertainty) [33,70]. This in turn helps to identify useful methods and techniques for dealing with the uncertainties, ranging from stakeholder discussion to sensitivity and decision analyses [33,70].
Sensitivity and decision analyses can help to identify which sources of uncertainty most affect the final results [78-80] and the relative importance of each uncertain element. Once the major sources of uncertainty are known and prioritized, suitable tools can be selected for further analysis. The uncertainty tool catalogue by Van der Sluijs et al.
[81] provides guidance for selecting appropriate methods that match the characterization of the uncertainty in the typology.
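The ranking role of sensitivity analysis described above can be illustrated with a simple one-at-a-time (OAT) perturbation. The exposure model and parameter values below are hypothetical, not taken from the cited studies; the point is only how each uncertain input's influence on the output is compared.

```python
# Hypothetical one-at-a-time (OAT) sensitivity sketch: rank which uncertain
# input most affects a simple exposure estimate.

def exposure(concentration, intake_rate, body_weight):
    """Daily dose (mg/kg/day) for a simple intake model (illustrative)."""
    return concentration * intake_rate / body_weight

baseline = {"concentration": 2.0, "intake_rate": 1.5, "body_weight": 70.0}

def oat_sensitivity(model, params, delta=0.10):
    """Relative change in output when each input is perturbed by +delta,
    sorted from most to least influential."""
    base = model(**params)
    effects = {}
    for name in params:
        perturbed = dict(params)
        perturbed[name] *= 1 + delta
        effects[name] = abs(model(**perturbed) - base) / base
    return dict(sorted(effects.items(), key=lambda kv: -kv[1]))

print(oat_sensitivity(exposure, baseline))
```

Inputs with the largest relative effect would be prioritized for further uncertainty reduction, mirroring the prioritization step described in the text.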
Refsgaard et al. [82] also describe various methods for dealing with uncertainties, and explain which purposes they may serve.
Finally, quality assurance/quality control (QA/QC) is one of the most critical components of an IEHM programme, and should make use of standard operating procedures (SOPs) to provide data of known quality.
The generation of reliable field and analytical data is best achieved through the development and implementation of a QA/QC plan [83,84].
Development of a rigorous QA/QC plan should be done in cooperation with all stakeholders throughout the entire monitoring period [85].
The fundamental elements of a QA/QC plan are: (1) Data quality objectives (DQOs) [86] are used to establish performance and acceptance criteria for field and laboratory measurement processes and to set levels of acceptable measurement error. DQOs are usually established for five aspects of data quality: representativeness, completeness, comparability, accuracy, and precision; (2)
Auditing: Undertaking regular audits of field, laboratory and data management operations provides valuable feedback on the adequacy, implementation, and effectiveness of existing quality systems. Regular quality audits allow for the information gained to be used to make improvements to the quality system or plan and provide a benchmark for maintaining a level of competence; (3)
Data generation and acquisition: In accordance with established DQOs, the generation and acquisition of high-quality monitoring data relies on the adherence to quality control measures during field sampling operations and laboratory analyses.
Typically, quality control is maintained during field work operations by adhering to standardized sampling protocols, analysis of QA/QC samples, and the regular calibration and maintenance of
field instrumentation; (4) Data validation and usability: The quality review and validation of data can be readily undertaken by assessing all data for compliance with the project’s DQOs. Quality checks are also undertaken along all data flow paths, particularly at data entry steps.
If data are found to fall outside the accepted DQO limits, then corrective actions (e.g., re-analysing suspect samples, re-sampling and re-analysing, accepting data with an acknowledged level of bias and imprecision, discarding data) may be undertaken; and (5) Data management: Data quality through the various data generation, acquisition, assessment and storage processes should be managed by a series of quality control measures.
These can include managing the chain of custody; defined data flow paths, the use of standard field data sheets and laboratory reports, and information management (database) systems.
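The data validation step in element (4) above can be sketched as a simple compliance check. The metrics (relative percent difference of field duplicates for precision, spike recovery for accuracy) are standard QC measures, but the acceptance limits and the `check_dqo` helper below are hypothetical, for illustration only.

```python
# Illustrative sketch of a DQO compliance check during data validation:
# flag results whose precision (relative percent difference of duplicates)
# or accuracy (spike recovery) fall outside hypothetical acceptance limits.

def rpd(x1, x2):
    """Relative percent difference between duplicate measurements."""
    return abs(x1 - x2) / ((x1 + x2) / 2) * 100

def check_dqo(duplicates, recovery_pct, max_rpd=20.0,
              recovery_range=(80.0, 120.0)):
    """Return a list of DQO violations; an empty list means the data pass."""
    issues = []
    if rpd(*duplicates) > max_rpd:
        issues.append("precision outside DQO limit: re-analyse suspect samples")
    lo, hi = recovery_range
    if not lo <= recovery_pct <= hi:
        issues.append("accuracy outside DQO limit: check calibration")
    return issues

print(check_dqo(duplicates=(10.2, 10.6), recovery_pct=95.0))  # passes
print(check_dqo(duplicates=(10.0, 14.0), recovery_pct=70.0))  # flags both metrics
```

Any flagged result would then trigger the corrective actions listed under element (4).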
In addition, QA/QC and uncertainty are closely linked. For example, QA/QC requirements can improve accuracy and reduce uncertainty, while reduced uncertainty in turn yields data of better-known quality.
In practice, the assessment of uncertainty and QA/QC may need to be implemented in parallel, e.g., laboratories should report an estimate of the uncertainties associated with each measurement.
Discussion
The approaches to IEHM for IEHIA outlined in this paper combine two main components: an approach for designing and carrying out a realistic IEHM programme for complex, systematic E&H problems (the IEHM conceptual framework); and a qualitative approach for integrating data from multiple existing monitoring programmes (the IEHM structural work process, together with data integration, uncertainty, and QA/QC methods).
Neither of these approaches is without its limitations and challenges.
The former relate to the design and operation challenges of an IEHM programme; the latter concern the ability to integrate data from multiple monitoring sources.
Challenges
Following the IEHM framework and IEHM work process, a realistic IEHM programme may need to integrate existing monitoring programmes, which can constrain the harmonization of measurement techniques [52].
For instance, many of the standard design rules and methods used in the establishment of a new IEHM programme cannot be easily applied, since existing E&H monitoring programmes often have specific monitoring objectives with different measurement protocols and sampling designs.
These constraints on the form and function of an IEHM programme may compromise and complicate some key design issues, but in some respects this makes it even more important that design issues are fully considered at the onset [52].
In the design process, the expectations of end-users also need to be considered [52]. Unlike purely research-orientated activities, an IEHM programme should also provide input data for policy makers and other stakeholders. In this context, partial information from an IEHM programme may not be relevant if policy needs require information to be valid at higher levels of integration and harmonization. Data from an IEHM programme must therefore be defensible against criticism, e.g., representativeness, precision, consistency and reproducibility.
These goals can be achieved by thorough planning and early stakeholders’ participation in all the steps of the IEHM design process [52].
Combining the DPSEEA (cause-effect approach) with natural-eco-anthropogenic systems (systematic approach), an IEHM programme has to be highly interdisciplinary and must therefore be designed taking into account the priorities, perspectives and expertise of stakeholders at different levels.
When support for decision-making is needed, the issues underlying operational choices should be understandable to their intended audience. This can be a significant challenge, but could be addressed by the ‘analytical-deliberative’ approach of the National Research Council in the USA [13] and the extended peer community approach [87,88].
In addition, the procedure developed by Keune et al. (2010) [89], which includes a diversity of actors in selecting hot-spots for human biomonitoring research in Belgium, could also be considered to develop and inform the programme design. Given the inherent complexity of E&H problems, this field may be one of the first to benefit from the current paradigm shift from multi-disciplinary to transdisciplinary science to tackle complex societal issues. Furthermore, current data from E&H monitoring programmes face many challenges: (i) fragmentation o