
    Robustness of multiple testing procedures against dependence

    An important aspect of multiple hypothesis testing is controlling the significance level, or the level of Type I error. When the test statistics are not independent it can be particularly challenging to deal with this problem without resorting to very conservative procedures. In this paper we show that, in the context of contemporary multiple testing problems, where the number of tests is often very large, the difficulties caused by dependence are less serious than in classical cases. This is particularly true when the null distributions of test statistics are relatively light-tailed, for example, when they can be based on Normal or Student's $t$ approximations. There, if the test statistics can fairly be viewed as being generated by a linear process, an analysis founded on the incorrect assumption of independence is asymptotically correct as the number of hypotheses diverges. In particular, the point process representing the null distribution of the indices at which statistically significant test results occur is approximately Poisson, just as in the case of independence. The Poisson process also has the same mean as in the independence case, and of course exhibits no clustering of false discoveries. However, this result can fail if the null distributions are particularly heavy-tailed. There, clusters of statistically significant results can occur even when the null hypothesis is correct. We give an intuitive explanation for these disparate properties in the light- and heavy-tailed cases, and provide rigorous theory underpinning the intuition. Comment: Published at http://dx.doi.org/10.1214/07-AOS557 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
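
    As a rough illustration of the light-tailed case, the sketch below (my own simulation, not code from the paper) generates correlated null statistics from a Gaussian moving-average process, applies a fixed per-test threshold, and compares the exceedance counts with the Poisson count an independence analysis would predict; the number of tests, moving-average order, and per-test level are arbitrary choices.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_tests, n_reps, order = 10_000, 500, 5
    alpha_per_test = 0.001                        # fixed per-test significance level
    z_crit = stats.norm.isf(alpha_per_test)

    counts = []
    for _ in range(n_reps):
        eps = rng.standard_normal(n_tests + order)
        # light-tailed linear process: moving average of Gaussian innovations, unit variance
        t = np.convolve(eps, np.ones(order + 1), mode="valid") / np.sqrt(order + 1)
        counts.append(np.sum(t > z_crit))

    lam = n_tests * alpha_per_test                # Poisson mean under independence
    print(f"mean exceedances: {np.mean(counts):.2f}  (independence predicts {lam:.2f})")
    print(f"variance:         {np.var(counts):.2f}  (Poisson variance would be {lam:.2f})")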

    Estimators of the multiple correlation coefficient: local robustness and confidence intervals.

    Many robust regression estimators are defined by minimizing a measure of spread of the residuals. An accompanying $R^2$-measure, or multiple correlation coefficient, is then easily obtained. In this paper, local robustness properties of these robust $R^2$-coefficients are investigated. It is also shown how confidence intervals for the population multiple correlation coefficient can be constructed in the case of multivariate normality. Keywords: cautionary note; high breakdown point; influence function; intervals; model; multiple correlation coefficient; $R^2$-measure; regression analysis; residuals; robustness; squares regression
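
    A minimal sketch of the general idea, using my own choices rather than the authors' estimator: fit a robust (Huber M-estimator) regression and form an $R^2$-type coefficient as one minus the squared ratio of a robust spread of the residuals to a robust spread of the response, here measured by the normal-consistent MAD.

    import numpy as np
    import statsmodels.api as sm
    from scipy.stats import median_abs_deviation

    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 3))
    y = X @ np.array([1.0, -0.5, 2.0]) + rng.standard_normal(200)
    y[:5] += 15.0                                 # a few gross outliers

    fit = sm.RLM(y, sm.add_constant(X), M=sm.robust.norms.HuberT()).fit()
    s_resid = median_abs_deviation(fit.resid, scale="normal")     # robust spread of residuals
    s_total = median_abs_deviation(y - np.median(y), scale="normal")  # robust spread of response
    robust_r2 = 1.0 - (s_resid / s_total) ** 2
    print(f"robust R^2-type coefficient: {robust_r2:.3f}")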

    Robustness of the Fractal Regime for the Multiple-Scattering Structure Factor

    In the single-scattering theory of electromagnetic radiation, the {\it fractal regime} is a definite range in the photon momentum-transfer $q$, which is characterized by the scaling-law behavior of the structure factor: $S(q) \propto 1/q^{d_f}$. This allows a straightforward estimation of the fractal dimension $d_f$ of aggregates in {\it Small-Angle X-ray Scattering} (SAXS) experiments. However, this behavior is not commonly studied in optical scattering experiments because of the lack of information on its domain of validity. In the present work, we propose a definition of the multiple-scattering structure factor, which naturally generalizes the single-scattering function $S(q)$. We show that the mean-field theory of electromagnetic scattering provides an explicit condition to interpret the significance of multiple scattering. In this paper, we investigate and discuss electromagnetic scattering by three classes of fractal aggregates. The results obtained from the T-Matrix method show that the fractal scaling range is divided into two domains: 1) a genuine fractal regime, which is robust; 2) a possible anomalous scaling regime, $S(q) \propto 1/q^{\delta}$, with exponent $\delta$ independent of $d_f$ and related to the way the scattering mechanism uses the local morphology of the scatterer. The recognition, and an analysis, of the latter domain is of importance because it may result in a significant reduction of the fractal regime, and brings into question the proper mechanism in the build-up of multiple scattering. Comment: 9 pages, 4 figures, accepted for publication in Journal of Quantitative Spectroscopy and Radiative Transfer (JQSRT)
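
    For the single-scattering scaling law quoted above, the fractal dimension can be read off as the negative slope of log S(q) against log q inside the fractal regime. The sketch below is purely illustrative (synthetic data and arbitrary regime bounds), not the T-Matrix computation used in the paper.

    import numpy as np

    d_f_true = 1.8
    q = np.logspace(-2, 1, 200)                   # momentum transfer (arbitrary units)
    noise = 1.0 + 0.05 * np.random.default_rng(2).standard_normal(q.size)
    S = q ** (-d_f_true) * noise                  # synthetic structure factor, S(q) ~ 1/q^d_f

    # restrict the fit to an assumed fractal regime q_min < q < q_max
    mask = (q > 0.1) & (q < 5.0)
    slope, _ = np.polyfit(np.log(q[mask]), np.log(S[mask]), 1)
    print(f"estimated d_f = {-slope:.2f}  (true value {d_f_true})")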

    Integrated multiple mediation analysis: A robustness–specificity trade-off in causal structure

    Recent methodological developments in causal mediation analysis have addressed several issues regarding multiple mediators. However, these developed methods differ in their definitions of causal parameters, assumptions for identification, and interpretations of causal effects, making it unclear which method ought to be selected when investigating a given causal effect. Thus, in this study, we construct an integrated framework, which unifies all existing methodologies, as a standard for mediation analysis with multiple mediators. To clarify the relationship between existing methods, we propose four strategies for effect decomposition: two-way, partially forward, partially backward, and complete decompositions. This study reveals how the direct and indirect effects of each strategy are explicitly and correctly interpreted as path-specific effects under different causal mediation structures. In the integrated framework, we further verify the utility of the interventional analogues of direct and indirect effects, especially when natural direct and indirect effects cannot be identified or when cross-world exchangeability is invalid. Consequently, this study yields a robustness–specificity trade-off in the choice of strategies. Inverse probability weighting is considered for estimation. The four strategies are further applied to a simulation study for performance evaluation and for analyzing the Risk Evaluation of Viral Load Elevation and Associated Liver Disease/Cancer data set from Taiwan to investigate the causal effect of hepatitis C virus infection on mortality.
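
    As a toy illustration of the inverse probability weighting mentioned for estimation, the sketch below reduces the problem to a single binary exposure A and a single binary mediator M (my simplification, not the integrated multiple-mediator framework), and weights by fitted exposure and mediator probabilities to estimate E[Y(a, M(a*))] and hence natural direct and indirect effects.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    n = 5000
    C = rng.standard_normal(n)                                    # baseline covariate
    A = rng.binomial(1, 1 / (1 + np.exp(-0.5 * C)))               # exposure
    M = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * A + 0.3 * C))))   # mediator
    Y = 1.0 * A + 1.5 * M + 0.5 * C + rng.standard_normal(n)      # outcome
    df = pd.DataFrame({"C": C, "A": A, "M": M, "Y": Y})

    ps_A = LogisticRegression().fit(df[["C"]], df["A"]).predict_proba(df[["C"]])
    ps_M = LogisticRegression().fit(df[["A", "C"]], df["M"])

    def mean_counterfactual(a, a_star):
        """IPW estimate of E[Y(a, M(a_star))] under standard mediation assumptions."""
        m = df["M"].to_numpy()
        p_a = ps_A[:, a]                                          # P(A = a | C)
        p_m_a = ps_M.predict_proba(df[["A", "C"]].assign(A=a))[np.arange(n), m]
        p_m_s = ps_M.predict_proba(df[["A", "C"]].assign(A=a_star))[np.arange(n), m]
        w = (df["A"].to_numpy() == a) * p_m_s / (p_a * p_m_a)
        return np.mean(w * df["Y"].to_numpy())

    nde = mean_counterfactual(1, 0) - mean_counterfactual(0, 0)   # natural direct effect
    nie = mean_counterfactual(1, 1) - mean_counterfactual(1, 0)   # natural indirect effect
    print(f"NDE ~ {nde:.2f}, NIE ~ {nie:.2f}")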

    Potential landscape-scale pollinator networks across Great Britain: structure, stability and influence of agricultural land cover

    Understanding spatial variation in the structure and stability of plant-pollinator networks, and their relationship with anthropogenic drivers, is key to maintaining pollination services and mitigating declines. Constructing sufficient networks to examine patterns over large spatial scales remains challenging. Using biological records (citizen science), we constructed potential plant-pollinator networks at 10 km resolution across Great Britain, comprising all potential interactions inferred from recorded floral visitation and species co-occurrence. We calculated network metrics (species richness, connectance, pollinator and plant generality) and adapted existing methods to assess robustness to sequences of simulated plant extinctions across multiple networks. We found positive relationships between agricultural land cover and both pollinator generality and robustness to extinctions under several extinction scenarios. Increased robustness was attributable to changes in plant community composition (fewer extinction-prone species) and network structure (increased pollinator generality). Thus, traits enabling persistence in highly agricultural landscapes can confer robustness to potential future perturbations on plant-pollinator networks.
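
    The extinction-robustness idea can be sketched in a few lines (a generic version of the standard metric, not the adapted method from the paper): remove plants in some order, let pollinators drop out once they retain no partners, and score robustness as the area under the pollinator survival curve. The toy network and species names below are made up.

    import random

    def robustness(network, extinction_order):
        """network: dict mapping plant -> set of pollinators recorded visiting it."""
        pollinators = set().union(*network.values())
        remaining = {p: set(v) for p, v in network.items()}
        surviving = [len(pollinators)]
        for plant in extinction_order:
            remaining.pop(plant, None)
            supported = set().union(*remaining.values()) if remaining else set()
            surviving.append(len(pollinators & supported))
        # trapezoidal area under the survival curve, normalised to [0, 1]
        frac = [s / len(pollinators) for s in surviving]
        return sum((frac[i] + frac[i + 1]) / 2 for i in range(len(frac) - 1)) / (len(frac) - 1)

    net = {"clover": {"bee", "hoverfly"}, "thistle": {"bee"}, "bramble": {"hoverfly", "beetle"}}
    order = list(net)
    random.shuffle(order)                          # one simulated extinction sequence
    print(f"robustness R = {robustness(net, order):.2f}")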

    Exploring tradeoffs in pleiotropy and redundancy using evolutionary computing

    Evolutionary computation algorithms are increasingly being used to solve optimization problems, as they have many advantages over traditional optimization algorithms. In this paper we use evolutionary computation to study the trade-off between pleiotropy and redundancy in a client-server based network. Pleiotropy is a term used to describe components that perform multiple tasks, while redundancy refers to multiple components performing the same task. Pleiotropy reduces cost but lacks robustness, whereas redundancy increases network reliability but is more costly; together, pleiotropy and redundancy build flexibility and robustness into systems. It is therefore desirable to have a network that strikes a balance between pleiotropy and redundancy. We explore how factors such as link failure probability, repair rates, and the size of the network influence these design choices, using genetic algorithms. Comment: 10 pages, 6 figures
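
    As a toy illustration of the trade-off (my own simplified model, not the client-server network in the paper): a genome records how many redundant components serve each task, fitness rewards reliability under a fixed failure probability and penalizes total component cost, and a small genetic algorithm searches for the balance.

    import random

    TASKS, P_FAIL, COST_WEIGHT = 4, 0.2, 0.05

    def fitness(genome):
        # genome[i] = number of redundant components serving task i
        reliability = 1.0
        for n in genome:
            reliability *= 1.0 - P_FAIL ** max(n, 1)   # a task fails only if every copy fails
        return reliability - COST_WEIGHT * sum(genome)

    def mutate(genome):
        g = list(genome)
        i = random.randrange(TASKS)
        g[i] = max(1, g[i] + random.choice([-1, 1]))   # add or remove one component
        return g

    population = [[random.randint(1, 5) for _ in range(TASKS)] for _ in range(30)]
    for _ in range(200):
        population.sort(key=fitness, reverse=True)
        parents = population[:10]                      # truncation selection
        population = parents + [mutate(random.choice(parents)) for _ in range(20)]

    best = max(population, key=fitness)
    print("best redundancy per task:", best, "fitness:", round(fitness(best), 3))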

    Modular networks emerge from multiconstraint optimization

    Modular structure is ubiquitous among complex networks. We note that most such systems are subject to multiple structural and functional constraints, e.g., minimizing the average path length and the total number of links, while maximizing robustness against perturbations in node activity. We show that the optimal networks satisfying these three constraints are characterized by the existence of multiple subnetworks (modules) sparsely connected to each other. In addition, these modules have distinct hubs, resulting in an overall heterogeneous degree distribution. Comment: 5 pages, 4 figures; Published version
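
    A minimal sketch of scoring a candidate network against the three constraints named in the abstract, using my own composite function (the paper's exact objective may differ): penalize average path length and link count, and reward a robustness proxy, here the algebraic connectivity of the graph Laplacian.

    import networkx as nx
    import numpy as np

    def multiconstraint_score(G, w_links=0.05, w_robust=1.0):
        path = nx.average_shortest_path_length(G)      # minimize
        links = G.number_of_edges()                     # minimize
        robust = np.sort(nx.laplacian_spectrum(G))[1]   # algebraic connectivity, maximize
        return path + w_links * links - w_robust * robust

    modular = nx.connected_caveman_graph(4, 5)          # four sparsely linked modules
    seed = 0
    random_g = nx.gnm_random_graph(20, modular.number_of_edges(), seed=seed)
    while not nx.is_connected(random_g):                # path length needs a connected graph
        seed += 1
        random_g = nx.gnm_random_graph(20, modular.number_of_edges(), seed=seed)

    print("modular:", round(multiconstraint_score(modular), 2),
          "random:", round(multiconstraint_score(random_g), 2))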