Deterministic & Probabilistic Assessments
Monte Carlo (MC) and deterministic methods share the same basic model: an equation that relates the various parameters likely to affect exposure and/or risk. They differ, however, in the data used to represent those parameters:
• Deterministic methods use point estimates, which are often, but not necessarily always, worst-case estimates
• Monte Carlo methods use distributions for one, some, or all of the parameters in the equation
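The contrast above can be sketched in a few lines of Python for a simple exposure equation (dose = concentration × intake rate ÷ body weight); all parameter values and distribution shapes below are hypothetical, chosen only for illustration:

```python
# Minimal sketch (hypothetical values) contrasting a deterministic point
# estimate with a Monte Carlo distribution for a simple exposure equation:
#   dose = concentration * intake_rate / body_weight
import numpy as np

rng = np.random.default_rng(seed=0)
n = 100_000  # number of Monte Carlo iterations

# Deterministic: upper-bound point estimates for each parameter
det_dose = 5.0 * 2.5 / 60.0  # (mg/kg) * (kg/day) / (kg body weight)

# Monte Carlo: assumed distributions for the same parameters
concentration = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)   # mg/kg
intake_rate = rng.triangular(left=0.5, mode=1.5, right=2.5, size=n)  # kg/day
body_weight = rng.normal(loc=70.0, scale=10.0, size=n).clip(min=40.0)  # kg

mc_dose = concentration * intake_rate / body_weight

# The deterministic upper-bound estimate typically sits in the upper tail
# of the Monte Carlo distribution, which quantifies its conservatism.
print(f"deterministic dose: {det_dose:.4f} mg/kg/day")
print(f"MC median: {np.median(mc_dose):.4f}, "
      f"95th percentile: {np.percentile(mc_dose, 95):.4f}")
```

Comparing the single deterministic value against the percentiles of the simulated distribution is one simple way to show how conservative the point-estimate approach is.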
Input parameters for both the MC and deterministic methods should reflect the exposure timeframe of interest (e.g., chronic, subchronic, acute). Since deterministic methods use point estimates to represent input parameters (e.g., use frequencies, contact rates), using average values to represent these parameters could underestimate the exposures, while using upper-bound estimates would produce overly conservative estimates. MC methods, on the other hand, allow distributions to represent the long-term profiles, so distributions of per-subject exposures and risks can be generated. However, the data needed to construct these long-term profiles may not be available, and care must be taken to maintain potential within-subject correlations between input parameters (e.g., the hand surface area at a given age should be correlated with that at subsequent ages).
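One common way to maintain such within-subject correlations is to assign each simulated subject a fixed percentile rank that is reused at every age. The sketch below illustrates the idea with hypothetical growth parameters for hand surface area:

```python
# Illustrative sketch (hypothetical growth data) of preserving within-subject
# correlation across ages: each simulated subject keeps the same percentile
# rank of hand surface area over time, rather than being drawn independently
# at each age.
import numpy as np

rng = np.random.default_rng(seed=1)
n_subjects = 10_000
ages = [3, 4, 5]  # years (assumed timeframe)

# Assumed age-specific lognormal parameters for hand surface area (cm^2)
geo_mean = {3: 150.0, 4: 170.0, 5: 190.0}
geo_sd_log = 0.15  # log-scale SD, assumed constant across ages

# One standard-normal score per subject, reused at every age, so a subject
# in the upper tail at age 3 stays in the upper tail at ages 4 and 5.
z = rng.standard_normal(n_subjects)
surface_area = {a: geo_mean[a] * np.exp(geo_sd_log * z) for a in ages}

# Rank correlation between ages is 1.0 by construction; independent draws
# at each age would give ~0 and would understate the between-subject
# variability of long-term average exposure.
```

Real assessments would typically relax the perfect correlation (e.g., by mixing a subject-level score with an age-specific random component), but the sampling structure is the same.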
The point estimates and distributions used to represent the parameters in deterministic and MC models can be based on experimental studies, observational studies, or, when actual data do not exist, expert judgment. The distributions used in an MC risk assessment can be modeled using empirical data or fitted statistical distributions (e.g., lognormal, triangular). From a resource perspective, it is more efficient to reserve MC modeling for cases where deterministic approaches prove inadequate. The main advantage of adopting MC techniques for exposure assessment is that they can yield a more accurate estimate of the range of exposures, in particular by demonstrating the conservatism of the maximum point-estimate deterministic approach. MC assessments should include sensitivity and uncertainty analyses to assess the impact of influential parameters on the variability and uncertainty in the outcome distribution:
• Sensitivity analyses assess the impact of variability in the input parameters on variability in the outcome. Typical analyses measure the correlation between individual input parameters and the outcome, or the change in the outcome when a single input parameter is allowed to vary while the others are held constant (e.g., at their mean or median values).
• Uncertainty analyses assess the impact of uncertainty in the input parameters on uncertainty in the output; two-dimensional (2D) MC methods, which nest a variability simulation inside an outer uncertainty loop, are typically used to run such analyses.
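A correlation-based sensitivity analysis is straightforward to prototype. The sketch below (hypothetical dose model and assumed input distributions) computes a Spearman rank correlation between each input and the Monte Carlo output, plus a one-at-a-time check with the other inputs held at their medians:

```python
# Sketch (hypothetical model) of a simple sensitivity analysis on a
# Monte Carlo dose simulation: rank correlations identify which inputs
# drive variability in the output.
import numpy as np

rng = np.random.default_rng(seed=2)
n = 50_000

# Assumed input distributions for dose = concentration * intake / weight
conc = rng.lognormal(np.log(2.0), 0.5, n)           # mg/kg
intake = rng.triangular(0.5, 1.5, 2.5, n)           # kg/day
weight = rng.normal(70.0, 10.0, n).clip(min=40.0)   # kg
dose = conc * intake / weight

def spearman(a, b):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    ra = a.argsort().argsort()
    rb = b.argsort().argsort()
    return np.corrcoef(ra, rb)[0, 1]

# Correlation-based sensitivity: which inputs drive output variability?
rhos = {name: spearman(x, dose)
        for name, x in [("concentration", conc),
                        ("intake", intake),
                        ("weight", weight)]}
for name, rho in rhos.items():
    print(f"{name:>13}: Spearman rho = {rho:+.2f}")

# One-at-a-time alternative: vary one input, hold the others at their medians
dose_oat = conc * np.median(intake) / np.median(weight)
print(f"dose range from concentration alone: "
      f"{dose_oat.min():.3f} to {dose_oat.max():.3f} mg/kg/day")
```

With these assumed distributions, concentration shows the strongest (positive) correlation with dose and body weight a negative one, which is the kind of ranking a sensitivity analysis is meant to surface.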
How Can Exponent Help You?
Mathematical techniques for MC are readily implemented using standard “off-the-shelf” software (e.g., @RISK®, Crystal Ball®, various statistical packages). In addition, stand-alone models can be developed.
Probabilistic Assessments Using Exponent’s Stand-Alone Models: Exponent can conduct dietary and residential exposure assessments using software specially designed for these tasks (DEEM-FCID, FARE, CALENDEX). These models have been reviewed by EPA’s FIFRA Scientific Advisory Panel and are used by government agencies to conduct exposure assessments.
Development of Custom Designed Models to Address Specific Questions
Exponent has also developed several probabilistic models, customized to address specific questions, using Crystal Ball, an Excel add-in. For instance:
• Chemical contaminants and food additives: Exponent has developed several probabilistic dietary exposure models to assess short-term and long-term intakes of chemical contaminants and food additives
• Microbial risk assessment: Exponent has developed probabilistic risk assessment models to estimate exposure to and risk from microbes in foods and to assess the impact of mitigation strategies on these estimates
Evaluation of Probabilistic Models Developed by Regulatory Agencies
Exponent scientists have also reviewed and provided objective evaluations of several probabilistic models developed by government agencies for use in regulatory assessments. For instance:
• On behalf of a consortium of food companies, Exponent reviewed the model developed by FDA for assessing exposure and risk to Listeria monocytogenes in ready-to-eat foods, and reviewed the model developed by USDA FSIS for assessing exposure and risk to Listeria monocytogenes in deli meats and the impact of potential mitigation strategies
• On behalf of an industry group, Exponent reviewed the model developed by U.S. EPA for assessing exposures to chemicals used in treating wood for the construction of outdoor decks and play structures, and assessed the impact of using alternative data in this model.
Professionals

Caroline Harris, Ph.D., CChem, FRSC
Chemical Regulation & Food Safety
Corporate Vice President, Director of UK Offices, Practice Director, & Principal Scientist
Harrogate (UK), Basel (Switzerland), Derby (UK), Düsseldorf (Germany), & London (UK)

Carolyn G. Scrafford, Ph.D., M.P.H.
Chemical Regulation & Food Safety
Senior Managing Scientist & Office Director
District of Columbia

Leila M. Barraj, D.Sc.
Chemical Regulation & Food Safety
Senior Managing Scientist
District of Columbia