
    Minimax Structured Normal Means Inference

    We provide a unified treatment of a broad class of noisy structure recovery problems, known as structured normal means problems. In this setting, the goal is to identify, from a finite collection of Gaussian distributions with different means, the distribution that produced some observed data. Recent work has studied several special cases including sparse vectors, biclusters, and graph-based structures. We establish nearly matching upper and lower bounds on the minimax probability of error for any structured normal means problem, and we derive an optimality certificate for the maximum likelihood estimator, which can be applied to many instantiations. We also consider an experimental design setting, where we generalize our minimax bounds and derive an algorithm for computing a design strategy with a certain optimality property. We show that our results give tight minimax bounds for many structure recovery problems and consider some consequences for interactive sampling.
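    As a minimal sketch of the maximum likelihood estimator in this setting (hypothetical candidate structures and signal strength; this is an illustrative special case, not the paper's code), the estimator simply picks the candidate mean closest to the observation:

```python
import numpy as np

# Hypothetical candidate structures: all 1-sparse mean vectors in R^5
# with signal strength mu (the sparse-vector special case from the abstract).
mu = 2.0
candidates = [mu * np.eye(5)[i] for i in range(5)]

def ml_structure(x, candidates):
    """Index of the candidate mean closest to x in Euclidean distance,
    i.e. the maximum likelihood structure under isotropic Gaussian noise."""
    return int(np.argmin([np.sum((x - m) ** 2) for m in candidates]))

rng = np.random.default_rng(0)
x = candidates[2] + 0.5 * rng.standard_normal(5)  # noisy observation of structure 2
print(ml_structure(x, candidates))
```

    The minimax question the paper studies is how large mu must be, relative to the noise and the geometry of the candidate set, for this estimator to succeed with high probability.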

    Bayesian analysis of multiple direct detection experiments

    Bayesian methods offer a coherent and efficient framework for incorporating uncertainties into induction problems. In this article, we review how this approach applies to the analysis of dark matter direct detection experiments. In particular, we discuss the exclusion limit of XENON100 and the debated hints of detection under the hypothesis of a WIMP signal. Within parameter inference, marginalizing consistently over uncertainties to extract robust posterior probability distributions, we find that the claimed tension between XENON100 and the other experiments can be partially alleviated in an isospin-violating scenario, while the elastic scattering model appears to be compatible with the frequentist statistical approach. We then move to model comparison, for which Bayesian methods are particularly well suited. Firstly, we investigate the annual modulation seen in CoGeNT data, finding that there is weak evidence for a modulation. Modulation models due to other physics compare unfavorably with the WIMP models, paying the price for their excessive complexity. Secondly, we confront several coherent scattering models to determine the current best physical scenario compatible with the experimental hints. We find that exothermic and inelastic dark matter are moderately disfavored against the elastic scenario, while the isospin-violating model has similar evidence. Lastly, the Bayes factor gives inconclusive evidence for an incompatibility between the data sets of XENON100 and the hints of detection. The same question assessed with goodness of fit would indicate a 2 sigma discrepancy. This suggests that more data are needed to settle this question.
    Comment: 29 pages, 8 figures; invited review for the special issue of the journal Physics of the Dark Universe; matches the published version

    Inferring the intensity of Poisson processes at the limit of the detector sensitivity (with a case study on gravitational wave burst search)

    We consider the issue of reporting the result of a search experiment in the most unbiased and efficient way, i.e. in a way which allows easy interpretation and combination of results and which does not depend on whether or not the experimenters believe they have found the searched-for effect. Since this work uses the language of Bayesian theory, with which most physicists are unfamiliar, we find that it could be useful to practitioners to have in a single paper a simple presentation of Bayesian inference, together with an example of its application to the search for rare processes.
    Comment: 36 pages, 11 figures, Latex files using cernart.cls (included). This paper and related work are also available at http://www-zeus.roma1.infn.it/~agostini/prob+stat.htm

    Incorporating prior knowledge improves detection of differences in bacterial growth rate

    BACKGROUND: Robust statistical detection of differences in the bacterial growth rate can be challenging, particularly when dealing with small differences or noisy data. The Bayesian approach provides a consistent framework for inferring model parameters and comparing hypotheses. The method captures the full uncertainty of parameter values, whilst making effective use of prior knowledge about a given system to improve estimation. RESULTS: We demonstrated the application of Bayesian analysis to bacterial growth curve comparison. Following extensive testing of the method, the analysis was applied to the large dataset of bacterial responses which is freely available at the web resource ComBase. Detection was found to be improved by using prior knowledge from clusters of previously analysed experimental results under similar environmental conditions. A comparison was also made to a more traditional statistical testing method, the F-test, and Bayesian analysis was found to perform more conclusively and to be capable of attributing significance to more subtle differences in growth rate. CONCLUSIONS: We have demonstrated that by making use of existing experimental knowledge, it is possible to significantly improve detection of differences in bacterial growth rate.
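    The kind of comparison described can be sketched in miniature (synthetic data, a toy log-linear growth model `y = r * t` with known noise, and a flat prior on a rate grid — this is an illustrative assumption, not the ComBase pipeline):

```python
import numpy as np

def grid_posterior(t, y, rates, sigma=0.1):
    """Grid posterior over growth rate r for the toy model
    y = log(OD/OD0) = r * t, with known Gaussian noise sigma
    and a flat prior on the rate grid."""
    ll = -np.sum((y[None, :] - rates[:, None] * t[None, :]) ** 2,
                 axis=1) / (2 * sigma**2)
    post = np.exp(ll - ll.max())
    return post / post.sum()

t = np.linspace(0.0, 5.0, 20)
rates = np.linspace(0.0, 1.0, 201)
rng = np.random.default_rng(1)
curve_a = 0.40 * t + rng.normal(0, 0.1, t.size)  # synthetic data, true rate 0.40
curve_b = 0.50 * t + rng.normal(0, 0.1, t.size)  # synthetic data, true rate 0.50
post_a = grid_posterior(t, curve_a, rates)
post_b = grid_posterior(t, curve_b, rates)
# Posterior probability that condition B grows faster than A,
# treating the two posteriors as independent.
p_b_faster = np.sum(np.outer(post_a, post_b)[np.triu_indices(rates.size, k=1)])
print(round(p_b_faster, 3))
```

    Prior knowledge from similar conditions would enter by replacing the flat prior on the rate grid with an informative one, which is the improvement the abstract reports.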

    Bayesian Inference in Processing Experimental Data: Principles and Basic Applications

    This report introduces general ideas and some basic methods of Bayesian probability theory applied to physics measurements. Our aim is to make the reader familiar, through examples rather than rigorous formalism, with concepts such as: model comparison (including the automatic Ockham's Razor filter provided by the Bayesian approach); parametric inference; quantification of the uncertainty about the value of physical quantities, also taking into account systematic effects; role of marginalization; posterior characterization; predictive distributions; hierarchical modelling and hyperparameters; Gaussian approximation of the posterior and recovery of conventional methods, especially maximum likelihood and chi-square fits under well-defined conditions; conjugate priors, transformation invariance and maximum entropy motivated priors; Monte Carlo estimates of expectation, including a short introduction to Markov Chain Monte Carlo methods.
    Comment: 40 pages, 2 figures, invited paper for Reports on Progress in Physics
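    The automatic Ockham's Razor mentioned above can be shown with a standard textbook example (a coin-flip model comparison of my own construction, not taken from the report): a simple model is rewarded for making sharper predictions, until the data overcome it.

```python
from math import comb

def bayes_factor_fair_vs_unknown(n, k):
    """Bayes factor for M0 (coin exactly fair) against M1
    (bias p uniform on [0, 1]) after observing k heads in n flips."""
    z0 = comb(n, k) * 0.5 ** n  # evidence under M0
    z1 = 1.0 / (n + 1)          # ∫ C(n,k) p^k (1-p)^(n-k) dp = 1/(n+1)
    return z0 / z1

# 10 heads in 20 flips: the simpler model M0 is favoured (Ockham's Razor);
# 18 heads in 20 flips: the data overcome M0's head start and favour M1.
print(bayes_factor_fair_vs_unknown(20, 10), bayes_factor_fair_vs_unknown(20, 18))
```

    The flexible model M1 spreads its predictive probability over all outcomes, so it is penalized automatically whenever the data sit where the simple model already predicted them.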

    Evaluation methods and decision theory for classification of streaming data with temporal dependence

    Predictive modeling on data streams plays an important role in modern data analysis, where data arrives continuously and needs to be mined in real time. In the stream setting the data distribution is often evolving over time, and models that update themselves during operation are becoming the state-of-the-art. This paper formalizes a learning and evaluation scheme of such predictive models. We theoretically analyze evaluation of classifiers on streaming data with temporal dependence. Our findings suggest that the commonly accepted data stream classification measures, such as classification accuracy and Kappa statistic, fail to diagnose cases of poor performance when temporal dependence is present, and therefore should not be used as sole performance indicators. Moreover, classification accuracy can be misleading if used as a proxy for evaluating change detectors with datasets that have temporal dependence. We formulate the decision theory for streaming data classification with temporal dependence, develop a new evaluation methodology that takes this dependence into account, and propose a combined performance measure that we recommend as the main performance measure in classification of streaming data.
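    One diagnostic in this spirit can be sketched as follows (a binary label stream and a hypothetical `kappa_temporal` helper of my own naming, which normalizes accuracy against a persistent no-change baseline — a simplified illustration, not the paper's combined measure): on an autocorrelated stream, plain accuracy looks excellent even for a classifier with no real skill.

```python
def accuracy(true, pred):
    return sum(t == p for t, p in zip(true, pred)) / len(true)

def kappa_temporal(true, pred):
    """Normalize a classifier's accuracy against the 'persistent'
    baseline that always predicts the previous label; values near 0
    mean no skill beyond exploiting temporal dependence."""
    p0 = accuracy(true, pred)
    p_per = accuracy(true[1:], true[:-1])  # accuracy of the no-change classifier
    return (p0 - p_per) / (1.0 - p_per)

# A strongly autocorrelated stream: long runs of identical labels.
stream = [0] * 50 + [1] * 50
naive = stream[:1] + stream[:-1]  # predictions of the persistent classifier
print(accuracy(stream, naive))    # 0.99: looks excellent in isolation
print(kappa_temporal(stream, naive))  # near 0: no skill beyond persistence
```

    This is why the abstract warns against using accuracy or the standard Kappa statistic as the sole performance indicator when temporal dependence is present.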