ABSTRACT: Practitioners and consumers of risk assessments often wonder whether the trend toward more complex risk models, incorporating increasing amounts of biological knowledge and increasing numbers of biologically interpretable parameters, actually leads to better risk estimates. A contrary view might be that the need to estimate more uncertain quantities undermines the advantages of greater descriptive realism so much that the final risk estimates are less certain than those traditionally obtained from simpler, less realistic, statistical curve-fitting models. In opposition to this pessimistic view is the widespread common-sense notion that including more information in a risk model can never worsen (and will usually improve) the resulting risk estimates. This paper appeals to mathematical arguments to resolve these conflicting intuitions. First, it emphasizes the fact that risk depends on multiple inputs only through a small number of "reduced quantities" -- perhaps only one, which would then be defined as internal dose. Thus, uncertainty about risk may have limited sensitivity to uncertainties in the original input quantities. The concept of internal dose and its possible algebraic relations to the original input quantities are clarified using concepts from dimensional analysis. Then, the question of whether greater model complexity leads to better or worse risk estimates is addressed in an information-theoretic framework, using entropies of probability distributions to quantify uncertainties. Within this framework, it is shown that models with greater intrinsic or "structural" complexity (meaning complexity that cannot be eliminated by reformulating the model in terms of its reduced quantities) lead to better-informed risk estimates.
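As a minimal illustration of the information-theoretic idea mentioned in the abstract -- not a reproduction of the paper's own derivations -- the following sketch computes the Shannon entropy of two discrete probability distributions over hypothetical risk levels. A distribution whose mass is concentrated on fewer outcomes has lower entropy, which is the sense in which a "better-informed" risk estimate is less uncertain. The distributions here are invented for illustration only.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A vague risk estimate: four equally likely risk levels (maximum entropy).
vague = [0.25, 0.25, 0.25, 0.25]

# A better-informed estimate: most probability mass on a single risk level.
informed = [0.85, 0.05, 0.05, 0.05]

print(shannon_entropy(vague))     # 2.0 bits: maximal uncertainty over 4 levels
print(shannon_entropy(informed))  # lower entropy: a less uncertain estimate
```

In this vocabulary, the paper's claim is that added model complexity improves the final risk estimate only when it is "structural," i.e., when it survives reformulation of the model in terms of its reduced quantities and therefore genuinely lowers the entropy of the output distribution.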