
    What exactly are you inferring? A closer look at hypothesis testing

    This critical review describes the confused application of significance tests in environmental toxicology and chemistry that often produces incorrect inferences and indefensible regulatory decisions. Following a brief review of statistical testing theory, nine recommendations are put forward. The first is that confidence intervals be used instead of hypothesis tests whenever possible. The remaining recommendations apply if hypothesis tests are used: define and justify Type I and II error rates a priori; set and justify an effect size a priori; do not confuse p(E|H0) and p(H0|E); design tests permitting Positive Predictive Value estimation; publish negative results; estimate power a priori, not post hoc; as warranted by study goals, favor null hypotheses that are not conventional nil hypotheses; and avoid definitive inferences from isolated tests.
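
    To make the Positive Predictive Value recommendation concrete, here is a minimal sketch of the standard PPV calculation for a significance test; the prior, alpha, and power values are illustrative assumptions, not figures from the review:

    ```python
    # Positive Predictive Value of a significant result: P(real effect | significant).
    # This also illustrates why p(E|H0) (alpha) differs from p(H0|E).
    def positive_predictive_value(prior, alpha, power):
        """prior: P(effect is real) before testing; alpha: Type I error rate;
        power: 1 - Type II error rate."""
        true_positives = power * prior
        false_positives = alpha * (1.0 - prior)
        return true_positives / (true_positives + false_positives)

    # Assumed values: a 10% prior, alpha = 0.05, 80% power. PPV ~ 0.64, so
    # roughly one in three significant results would still be a false positive.
    print(positive_predictive_value(prior=0.1, alpha=0.05, power=0.8))
    ```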

    Projected Hg dietary exposure of 3 bird species nesting on a contaminated floodplain (South River, Virginia, USA)

    Dietary Hg exposure was modeled for Carolina wren (Thryothorus ludovicianus), Eastern song sparrow (Melospiza melodia), and Eastern screech owl (Otus asio) nesting on the contaminated South River floodplain (Virginia, USA). Parameterization of Monte Carlo models required formal expert elicitation to define bird body weight and feeding ecology characteristics because specific information was either unavailable in the published literature or too difficult to collect reliably by field survey. Mercury concentrations and weights for candidate food items were obtained directly by field survey. Simulations predicted the probability that an adult bird during breeding season would ingest specific amounts of Hg during daily foraging and the probability that the average Hg ingestion rate for the breeding season of an adult bird would exceed published rates reported to cause harm to other birds (>100 ng total Hg/g body weight per day). Despite the extensive floodplain contamination, the probabilities that these species' average ingestion rates exceeded the threshold value were all <0.01. Sensitivity analysis indicated that overall food ingestion rate was the most important factor determining projected Hg ingestion rates. Expert elicitation was useful in providing sufficiently reliable information for Monte Carlo simulation.
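
    A minimal sketch of the kind of Monte Carlo exposure model the abstract describes; every distribution and parameter value below is a hypothetical placeholder, whereas the paper parameterized them from expert elicitation and field survey:

    ```python
    # Daily Hg ingestion, normalized to body weight, for simulated bird-days.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000                                        # simulated bird-days

    body_weight_g = rng.normal(20.0, 1.5, n)           # adult body weight (g), assumed
    food_intake_g = rng.lognormal(np.log(3.0), 0.3, n) # daily intake (g), assumed
    hg_in_food = rng.lognormal(np.log(0.2), 0.8, n)    # Hg in food (ug/g), assumed

    # ug/day * 1000 = ng/day; divide by body weight -> ng Hg / g bw / day
    ingestion = food_intake_g * hg_in_food * 1000.0 / body_weight_g

    threshold = 100.0   # harm threshold quoted in the abstract (ng Hg/g bw/day)
    print("P(daily ingestion > threshold) =", np.mean(ingestion > threshold))
    ```

    The paper's headline probabilities refer to the breeding-season average rather than single days, which would be obtained by averaging simulated days per bird before comparing against the threshold.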

    Regression or significance tests: What other choice is there? An academic perspective (Response)

    Both the no-observed-effect concentration and its null hypothesis significance testing foundation have drawn steady criticism since their inceptions [1–5]. Many in our field reasonably advocate regression to avoid the shortcomings of conventional null hypothesis significance testing; however, regression is compromised under commonly encountered conditions (Green, present Perspective's Challenge). As the debate over null hypothesis significance testing versus regression methods continues into the 21st century, a sensible strategy might be to take a moment to ask: are there now other choices? Our goal is to sketch out one such choice.

    A New Analysis Method for Simulations Using Node Categorizations

    Most research concerning the influence of network structure on phenomena taking place on the network focuses on relationships between global statistics of the network structure and characteristic properties of those phenomena, even though local structure has a significant effect on the dynamics of some phenomena. In the present paper, we propose a new analysis method for phenomena on networks based on a categorization of nodes. First, local statistics such as the average path length and the clustering coefficient for a node are calculated and assigned to the respective node. Then, the nodes are categorized using the self-organizing map (SOM) algorithm. Characteristic properties of the phenomena of interest are visualized for each category of nodes. The validity of our method is demonstrated using the results of two simulation models. The proposed method is useful as a research tool for understanding the behavior of networks, in particular large-scale networks for which existing visualization techniques do not work well.
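
    A sketch of the pipeline the abstract outlines: per-node local statistics, then SOM categorization. The libraries (networkx, minisom), the example network, and the SOM grid size are assumptions; the paper does not prescribe an implementation:

    ```python
    import numpy as np
    import networkx as nx
    from minisom import MiniSom

    G = nx.connected_watts_strogatz_graph(500, 6, 0.1, seed=1)  # example network, assumed

    # Local statistics per node: average shortest-path length from the node,
    # and the node's clustering coefficient.
    features = []
    for v in G.nodes():
        lengths = nx.single_source_shortest_path_length(G, v)
        avg_len = sum(lengths.values()) / (len(lengths) - 1)
        features.append([avg_len, nx.clustering(G, v)])
    X = np.array(features)
    X = (X - X.mean(axis=0)) / X.std(axis=0)        # normalize each statistic

    # Categorize nodes on a small SOM grid; each node gets the grid cell of
    # its best-matching unit as its category.
    som = MiniSom(3, 3, X.shape[1], sigma=1.0, learning_rate=0.5, random_seed=1)
    som.train_random(X, 2000)
    categories = {v: som.winner(x) for v, x in zip(G.nodes(), X)}
    ```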

    Searching for network modules

    When analyzing complex networks, a key target is to uncover their modular structure, which means searching for a family of modules, namely node subsets each spanning a subnetwork more densely connected than the average. This work proposes a novel type of objective function for graph clustering, in the form of a multilinear polynomial whose coefficients are determined by network topology. It may be thought of as a potential function, to be maximized, taking its values on fuzzy clusterings, i.e., families of fuzzy subsets of nodes over which every node distributes a unit membership. When suitably parametrized, this potential is shown to attain its maximum when every node concentrates all of its unit membership on some module. The output is thus a partition, while the original discrete optimization problem is turned into a continuous version that allows alternative search strategies to be conceived. Since an instance of the problem is a pseudo-Boolean function assigning real-valued cluster scores to node subsets, modularity maximization is employed to exemplify a so-called quadratic form, in which the scores of singletons and pairs fully determine the scores of larger clusters, and the resulting multilinear polynomial potential function has degree 2. After considering further quadratic instances, different from modularity and obtained by interpreting network topology in alternative manners, a greedy local-search strategy for the continuous framework is analytically compared with an existing greedy agglomerative procedure for the discrete case. Overlapping is finally discussed in terms of multiple runs, i.e., several local searches with different initializations.
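
    One illustrative reading of the degree-2 (modularity) instance, sketched under the assumption that the potential sums the modularity matrix over node pairs weighted by the overlap of their fuzzy membership vectors; this is not the paper's code:

    ```python
    import numpy as np
    import networkx as nx

    G = nx.karate_club_graph()                   # example network, assumed
    A = nx.to_numpy_array(G)
    k = A.sum(axis=1)
    two_m = A.sum()
    B = A - np.outer(k, k) / two_m               # modularity matrix B_ij

    def potential(X):
        """Degree-2 multilinear potential for fuzzy memberships X (nodes x modules).
        Each row is nonnegative and sums to 1: the unit membership a node
        distributes over modules. For 0/1 rows this reduces to modularity up
        to a constant factor."""
        return np.einsum('ij,ic,jc->', B, X, X)

    n, c = A.shape[0], 4                         # 4 candidate modules, assumed
    rng = np.random.default_rng(0)
    X = rng.random((n, c))
    X /= X.sum(axis=1, keepdims=True)            # normalize to a fuzzy clustering
    print(potential(X))
    ```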

    Worldwide food recall patterns over an eleven month period: A country perspective.

    Background: Following the World Health Organization Forum in November 2007, the Beijing Declaration recognized the importance of food safety along with the rights of all individuals to a safe and adequate diet. The aim of this study is to retrospectively analyze the patterns in food alerts and recalls by country to identify the principal hazard generators and gatekeepers of food safety in the eleven months leading up to the Declaration. Methods: The food recall data set was collected by the Laboratory of the Government Chemist (LGC, UK) over the period from January to November 2007. Statistics were computed with the focus on reporting patterns by the 117 countries. The complexity of the recorded interrelations was depicted as a network constructed from structural properties contained in the data. The analysed network properties included degrees, weighted degrees, modularity and k-core decomposition. Network analyses of the reports, based on 'country making report' (detector) and 'country reported on' (transgressor), revealed that the network is organized around a dominant core. Results: Ten countries were reported for sixty per cent of all faulty products marketed, with the top 5 countries having received between 100 and 281 reports. Further analysis of the dominant core revealed that out of the top five transgressors, three made no reports (in the order China > Turkey > Iran). The top ten detectors account for three quarters of reports, with three making more than 300 (Italy: 406, Germany: 340, United Kingdom: 322). Conclusion: Of the 117 countries studied, the vast majority of food reports are made by 10 countries, with EU countries predominating. The majority of the faulty foodstuffs originate in ten countries, with four major producers making no reports. This pattern is very distant from that proposed by the Beijing Declaration, which urges all countries to take responsibility for the provision of safe and adequate diets for their nationals.
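
    A minimal sketch of the detector-to-transgressor network analysis described in the Methods; the report records below are toy examples, not the LGC data:

    ```python
    import networkx as nx

    # (detector, transgressor, number_of_reports): hypothetical example records
    reports = [("Italy", "China", 5), ("Germany", "China", 4),
               ("United Kingdom", "Turkey", 3), ("Italy", "Iran", 2)]

    G = nx.DiGraph()
    for detector, transgressor, n in reports:
        G.add_edge(detector, transgressor, weight=n)

    out_w = dict(G.out_degree(weight="weight"))   # reports made (detector role)
    in_w = dict(G.in_degree(weight="weight"))     # reports received (transgressor role)
    print("top detectors:", sorted(out_w, key=out_w.get, reverse=True)[:3])
    print("top transgressors:", sorted(in_w, key=in_w.get, reverse=True)[:3])

    # k-core decomposition on the undirected projection exposes the dominant core.
    core = nx.core_number(G.to_undirected())
    ```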

    Do on-farm natural, restored, managed and constructed wetlands mitigate agricultural pollution in Great Britain and Ireland?

    Wetlands in agricultural landscapes offer a number of benefits to the landscape function in which they are set, reducing nutrient runoff, providing additional habitat mosaics and offering various ecosystem services. They require careful planning and maintenance in order to perform their optimum design function over a prolonged period of time, and should be treated as functional units of farm infrastructure rather than fit-and-forget systems. A high-priority topic within the Department for Environment, Food and Rural Affairs (Defra) water quality programme is the mitigation of pollution from agriculture. This programme was set up to meet the requirements of the European Water Framework Directive (WFD; EU 2000). Nutrient loss from agricultural land has been suggested as a major cause of elevated nutrient concentrations in surface waters in the UK. Nitrogen (N) and phosphorus (P) are of particular concern, as an excess of either nutrient can lead to eutrophication of freshwater systems and coastal waters. Agriculture has also been identified as a significant source of suspended sediment (SS) in UK rivers, and agriculturally derived sediment has been identified as a source of increased bed-sediment P concentrations in rivers. High bed-sediment loads have other negative impacts, such as clogging river gravels and thereby reducing fish spawning. There is considerable evidence in the published and grey literature that wetlands have the ability to remove nutrients and sediment and thus reduce the load on receiving waters. Wetlands have also been reported to perform other ecosystem services, such as reducing floods, supporting biodiversity and sequestering carbon. A policy to promote the conservation, management, restoration or construction of wetlands could help to mitigate the impacts of N, P and SS from agriculture, delivering the requirements of the WFD through Catchment Sensitive Farming, following the Ecosystem Approach and the Catchment Based Approach promoted by Defra. It could also meet other commitments, such as implementing the Ramsar and Biodiversity Conventions, to which the UK is a signatory. However, the term wetlands covers a wide range of habitat types, and it is important that policy makers are provided with accurate, robust and independently reviewed information on the degree to which different types of wetland perform these services under different circumstances, so that policy can be best targeted. This systematic review assesses the available evidence on the performance of various on-farm wetland types in reducing nutrient and suspended-sediment inputs to receiving waters, providing a defensible evidence base on which to build policy. The studies reviewed cover different input loads, and the analysis compares the performance of these wetland systems in terms of percent reduction efficiency. In England and Wales, Defra, working closely with the Environment Agency and Natural England, has commissioned this systematic review on how effective wetlands are, and what influences their effectiveness, at mitigating N, P and SS inputs from agriculture to receiving freshwaters in the United Kingdom and Ireland.
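
    For reference, the percent reduction efficiency metric used to compare wetland systems, as a minimal sketch with illustrative numbers:

    ```python
    def reduction_efficiency(load_in, load_out):
        """Percent of the incoming nutrient or sediment load retained by a wetland."""
        return 100.0 * (load_in - load_out) / load_in

    # e.g. a wetland receiving 12.0 kg N and releasing 4.2 kg N retains 65%
    print(reduction_efficiency(12.0, 4.2))  # 65.0
    ```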

    Geographic constraints on social network groups

    Social groups are fundamental building blocks of human societies. While our social interactions have always been constrained by geography, it has been impossible, due to practical difficulties, to evaluate the nature of this restriction on social group structure. We construct a social network of individuals whose most frequent geographical locations are also known. We also classify the individuals into groups according to a community detection algorithm. We study the variation of geographical span for social groups of varying sizes, and explore the relationship between topological positions and geographic positions of their members. We find that small social groups are geographically very tight, but become much more clumped when the group size exceeds about 30 members. Also, we find no correlation between the topological positions and geographic positions of individuals within network communities. These results suggest that spreading processes face distinct structural and spatial constraints.
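
    A sketch of the group-span measurement the abstract describes: detect communities, then measure each group's geographic spread from member locations. The library choices and the span definition (mean pairwise great-circle distance) are assumptions, not the paper's stated method:

    ```python
    import itertools
    import math
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    def haversine_km(p, q):
        """Great-circle distance between two (lat, lon) points in kilometres."""
        lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
        a = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371.0 * math.asin(math.sqrt(a))

    def group_span(group, location):
        """Mean pairwise member distance, one proxy for a group's geographic span."""
        pairs = list(itertools.combinations(group, 2))
        return sum(haversine_km(location[u], location[v]) for u, v in pairs) / len(pairs)

    # usage sketch: G is the social network, location maps node -> (lat, lon)
    # for group in greedy_modularity_communities(G):
    #     print(len(group), group_span(group, location))
    ```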

    One-loop Quantum Gravity in Schwarzschild Spacetime

    The quantum theory of linearized perturbations of the gravitational field of a Schwarzschild black hole is presented. The fundamental operators are seen to be the perturbed Weyl scalars $\dot\Psi_0$ and $\dot\Psi_4$ associated with the Newman-Penrose description of the classical theory. Formulae are obtained for the expectation values of the modulus squared of these operators in the Boulware, Unruh and Hartle-Hawking quantum states. Differences between the renormalized expectation values of both $|\dot\Psi_0|^2$ and $|\dot\Psi_4|^2$ in the three quantum states are evaluated numerically.
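
    For context, the standard Newman-Penrose definitions of the two Weyl scalars named above, with $(l, n, m, \bar m)$ the usual null tetrad; these are the textbook definitions (sign conventions vary), not taken from this paper:

    ```latex
    \begin{align}
      \Psi_0 &= -C_{abcd}\, l^a m^b l^c m^d, \\
      \Psi_4 &= -C_{abcd}\, n^a \bar{m}^b n^c \bar{m}^d
    \end{align}
    ```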

    Calibration of the distance scale from galactic Cepheids: I. Calibration based on the GFG sample

    New estimates of the distances of 36 nearby galaxies are presented, based on accurate distances of galactic Cepheids obtained by Gieren, Fouque and Gomez (1998) from the geometrical Barnes-Evans method. The concept of 'sosie' is applied to extend the distance determination to extragalactic Cepheids without assuming the linearity of the period-luminosity (PL) relation; the distance moduli are thereby obtained in a straightforward way. The correction for extinction is made using two photometric bands (V and I) according to the principles introduced by Freedman and Madore (1990). Finally, the statistical bias due to the incompleteness of the sample is corrected according to the precepts introduced by Teerikorpi (1987) without introducing any free parameters (except the distance modulus itself in an iterative scheme). The final distance moduli depend on the adopted extinction ratio $R_V/R_I$ and on the limiting apparent magnitude of the sample. A comparison with the distance moduli recently published by the Hubble Space Telescope Key Project (HSTKP) team reveals fair agreement when the same ratio $R_V/R_I$ is used, but shows a small discrepancy at large distance. In order to bypass the uncertainty due to the metallicity effect, it is suggested to consider only galaxies having nearly the same metallicity as the calibrating Cepheids (i.e. solar metallicity). The internal uncertainty of the distances is about 0.1 magnitude, but the total uncertainty may reach 0.3 magnitude.
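
    A minimal sketch of the two-band extinction correction underlying the abstract: the apparent distance moduli satisfy $\mu_V - \mu_I = A_V - A_I$, so the true modulus follows from the adopted extinction ratio. The default ratio and the sample moduli below are illustrative assumptions:

    ```python
    def true_distance_modulus(mu_v, mu_i, rv_over_ri=1.7):
        """Deredden a Cepheid distance modulus from V and I band measurements.

        Uses mu_V - mu_I = A_V - A_I and A_I/A_V = R_I/R_V, so
        A_V = (mu_V - mu_I) / (1 - R_I/R_V).
        """
        factor = 1.0 / (1.0 - 1.0 / rv_over_ri)   # ~2.43 for R_V/R_I = 1.7
        return mu_v - factor * (mu_v - mu_i)

    # e.g. mu_V = 24.60, mu_I = 24.40 gives mu_0 ~ 24.60 - 2.43 * 0.20 ~ 24.11
    print(true_distance_modulus(24.60, 24.40))
    ```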