
    Bayesian model comparison applied to the Explorer-Nautilus 2001 coincidence data

    Bayesian reasoning is applied to the data of the ROG Collaboration, in which gravitational wave (g.w.) signals are searched for in a coincidence experiment between Explorer and Nautilus. The use of Bayesian reasoning allows, under well-defined hypotheses, even tiny pieces of evidence in favor of each model to be extracted from the data. The combination of the data of several experiments can therefore be performed in an optimal and efficient way. Some models for Galactic sources are considered and, within each model, the experimental result is summarized with the likelihood rescaled to the insensitivity limit value (the "${\cal R}$ function"). The model comparison result is given in terms of Bayes factors, which quantify how the ratio of beliefs about two alternative models is modified by the experimental observation. Comment: 16 pages, 4 figures. Presented at the GWDAW2002 conference, held in Kyoto in Dec. 2002. This version includes comments by the referees of CQG, which has accepted the paper for publication in the special issue of the conference. In particular, note that Eq. 12 contained a typesetting error. As suggested by one of the referees, a uniform prior in Log(alpha) has also been considered.
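
    The Bayes-factor bookkeeping this abstract describes can be sketched in a few lines. The Gaussian likelihoods and all numbers below are illustrative stand-ins, not the ROG analysis:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical observed coincidence statistic (illustrative, not ROG data).
observed = 1.8

def likelihood(mean, sigma, x=observed):
    """Likelihood of the observation under a Gaussian model prediction."""
    return norm.pdf(x, loc=mean, scale=sigma)

# Two alternative models with hypothetical predictions.
L_signal = likelihood(2.0, 0.5)   # e.g. a Galactic-source model
L_noise = likelihood(0.2, 0.5)    # e.g. a background-only model

# Bayes factor: how the data rescale the relative belief in the two models.
bayes_factor = L_signal / L_noise
print(f"Bayes factor = {bayes_factor:.2f}")

# Posterior odds = Bayes factor x prior odds; combining several experiments
# amounts to multiplying their individual Bayes factors.
posterior_odds = bayes_factor * 1.0  # indifferent prior odds
print(f"Posterior odds = {posterior_odds:.2f}")
```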

    The history of mass assembly of faint red galaxies in 28 galaxy clusters since z=1.3

    We measure the relative evolution of the number of bright and faint (as faint as 0.05 L*) red galaxies in a sample of 28 clusters, of which 16 are at 0.50 <= z <= 1.27, all observed through a pair of filters bracketing the rest-frame 4000 Angstrom break. The abundance of red galaxies, relative to bright ones, is constant over the whole studied redshift range, 0 < z < 1.3, and rules out a differential evolution between bright and faint red galaxies as large as claimed in some past works. Faint red galaxies are largely assembled and in place at z=1.3, and their deficit does not depend on cluster mass, parametrized by velocity dispersion or X-ray luminosity. Compared with previous analyses, ours samples a wider redshift range, minimizes systematics and pays more attention to statistical issues, while at the same time keeping a large number of clusters. Comment: MNRAS, 386, 1045. Half a single sentence (in Sec. 4.4) changed.
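
    The measured quantity is essentially a ratio of counts; a minimal sketch of such a faint-to-bright ratio with naive Poisson error propagation (the counts are made up, and the paper's statistical treatment is more careful):

```python
import numpy as np

# Hypothetical red-sequence counts in one cluster (not the paper's data).
n_bright = 40   # bright red galaxies
n_faint = 90    # faint red galaxies, down to 0.05 L*

# Faint-to-bright ratio with simple Poisson error propagation.
ratio = n_faint / n_bright
err = ratio * np.sqrt(1.0 / n_faint + 1.0 / n_bright)
print(f"faint/bright = {ratio:.2f} +/- {err:.2f}")
```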

    Bayesian Inference in Processing Experimental Data: Principles and Basic Applications

    This report introduces general ideas and some basic methods of Bayesian probability theory applied to physics measurements. Our aim is to make the reader familiar, through examples rather than rigorous formalism, with concepts such as: model comparison (including the automatic Ockham's Razor filter provided by the Bayesian approach); parametric inference; quantification of the uncertainty about the value of physical quantities, also taking into account systematic effects; the role of marginalization; posterior characterization; predictive distributions; hierarchical modelling and hyperparameters; Gaussian approximation of the posterior and recovery of conventional methods, especially maximum likelihood and chi-square fits under well-defined conditions; conjugate priors, transformation invariance and maximum-entropy motivated priors; Monte Carlo estimates of expectation, including a short introduction to Markov Chain Monte Carlo methods. Comment: 40 pages, 2 figures, invited paper for Reports on Progress in Physics.
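
    As a concrete instance of the parametric-inference and conjugate-prior ideas listed above, here is a minimal sketch of the textbook Gaussian-Gaussian update (all numbers are illustrative; this is the standard result, not an example taken from the report itself):

```python
import numpy as np

# Conjugate Gaussian-Gaussian inference: prior N(mu0, sigma0^2) on a true
# value, n measurements with known noise sigma -> Gaussian posterior.
mu0, sigma0 = 10.0, 2.0                  # prior belief (illustrative)
sigma = 1.0                              # known measurement noise
data = np.array([9.2, 9.8, 10.4, 9.5])   # made-up measurements

n = len(data)
# Precision-weighted combination of prior and data (conjugate update).
post_var = 1.0 / (1.0 / sigma0**2 + n / sigma**2)
post_mean = post_var * (mu0 / sigma0**2 + data.sum() / sigma**2)

print(f"posterior: mean = {post_mean:.3f}, std = {np.sqrt(post_var):.3f}")
```

    With a vague prior (large sigma0) the posterior mean tends to the sample mean, recovering the conventional maximum-likelihood answer; this is exactly the kind of recovery of standard methods under well-defined conditions that the report discusses.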

    Neural Network Parametrization of Deep-Inelastic Structure Functions

    We construct a parametrization of deep-inelastic structure functions which retains information on experimental errors and correlations, and which does not introduce any theoretical bias while interpolating between existing data points. We generate a Monte Carlo sample of pseudo-data configurations and we train an ensemble of neural networks on them. This effectively provides us with a probability measure in the space of structure functions, within the whole kinematic region where data are available. This measure can then be used to determine the value of the structure function, its error, point-to-point correlations and, generally, the value and uncertainty of any function of the structure function itself. We apply this technique to the determination of the structure function F_2 of the proton and deuteron, and to a precision determination of the isotriplet combination F_2[p-d]. We discuss these results in detail, check their stability and accuracy, and make them available in various formats for applications. Comment: Latex, 43 pages, 22 figures. (v2) Final version, published in JHEP; Sect. 5.2 and Fig. 9 improved, a few typos corrected and other minor improvements. (v3) Some inconsequential typos in Tab. 1 and Tab. 5 corrected. Neural parametrization available at http://sophia.ecm.ub.es/f2neura
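
    A schematic of the Monte Carlo replica strategy this abstract describes, with a toy dataset, uncorrelated errors for simplicity (the paper propagates full correlations), and a small scikit-learn network standing in for the paper's architecture:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy "structure function" data: x points, central values, errors (made up).
x = np.linspace(0.05, 0.9, 20).reshape(-1, 1)
f_central = 0.5 * (1 - x.ravel())**3 * x.ravel()**-0.3
f_err = 0.05 * f_central

# 1) Generate Monte Carlo pseudo-data replicas by fluctuating the data
#    within their errors; 2) train one network per replica.
ensemble = []
for _ in range(30):
    replica = rng.normal(f_central, f_err)
    net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000)
    net.fit(x, replica)
    ensemble.append(net)

# The trained ensemble defines a probability measure on the space of
# functions: value and uncertainty at any x follow from ensemble statistics.
x_test = np.array([[0.3]])
preds = np.array([net.predict(x_test)[0] for net in ensemble])
print(f"F2(0.3) ~ {preds.mean():.3f} +/- {preds.std():.3f}")
```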

    The Bjorken sum rule with Monte Carlo and Neural Network techniques

    Determinations of structure functions and parton distribution functions have recently been obtained using Monte Carlo methods and neural networks as universal, unbiased interpolants for the unknown functional dependence. In this work the same methods are applied to obtain a parametrization of polarized Deep Inelastic Scattering (DIS) structure functions. The Monte Carlo approach provides a bias-free determination of the probability measure in the space of structure functions, while retaining all the information on experimental errors and correlations. In particular, the error on the data is propagated into an error on the structure functions that has a clear statistical meaning. We present the application of this method to the parametrization from polarized DIS data of the photon asymmetries $A_1^p$ and $A_1^d$, from which we determine the structure functions $g_1^p(x,Q^2)$ and $g_1^d(x,Q^2)$, and discuss the possibility of extracting physical parameters from these parametrizations. This work can be used as a starting point for the determination of polarized parton distributions. Comment: 24 pages, 6 figures.
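
    The statistical point that a replica ensemble propagates errors into any derived quantity can be sketched at a single $(x,Q^2)$ point. The numbers below are made up, and the leading-order relation $g_1 \approx A_1 F_1$ is used here only to illustrate the mechanism:

```python
import numpy as np

rng = np.random.default_rng(1)
n_rep = 1000

# Toy replica ensembles at one (x, Q^2) point (illustrative values only).
A1p = rng.normal(0.25, 0.03, n_rep)   # replicas of the asymmetry A_1^p
F1p = rng.normal(1.10, 0.05, n_rep)   # replicas of an F_1^p input

# Evaluate the derived quantity replica by replica: errors (and, with real
# replicas, correlations) propagate automatically and keep a clear
# statistical meaning.
g1p = A1p * F1p
print(f"g_1^p = {g1p.mean():.3f} +/- {g1p.std():.3f}")
```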

    Can Old Galaxies at High Redshifts and Baryon Acoustic Oscillations Constrain H_0?

    A new age-redshift test is proposed in order to constrain $H_0$, based on the existence of old high-redshift galaxies (OHRG). As should be expected, the estimates of $H_0$ based on the OHRG are heavily dependent on the cosmological description. In the flat concordance model ($\Lambda$CDM), for example, the value of $H_0$ depends on the mass density parameter $\Omega_M = 1 - \Omega_\Lambda$. Such a degeneracy can be broken through a joint analysis involving the OHRG and the baryon acoustic oscillation (BAO) signature. In the framework of the $\Lambda$CDM model our joint analysis yields a value of $H_0 = 71^{+4}_{-4}$ km s$^{-1}$ Mpc$^{-1}$ ($1\sigma$) with the best-fit density parameter $\Omega_M = 0.27 \pm 0.03$. Such results are in good agreement with independent studies from the Hubble Space Telescope key project and the recent estimates of WMAP, thereby suggesting that the combination of these two independent phenomena provides an interesting method to constrain the Hubble constant. Comment: 16 pages, 6 figures, 1 table.
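
    A minimal sketch of how such a joint analysis breaks the $H_0$-$\Omega_M$ degeneracy on a grid, using the standard analytic age of a flat $\Lambda$CDM universe and hypothetical stand-ins for the OHRG age bound and the BAO constraint (none of the numbers below are the paper's):

```python
import numpy as np

H0_grid = np.linspace(60, 80, 201)     # km/s/Mpc
Om_grid = np.linspace(0.15, 0.45, 201)
H0, Om = np.meshgrid(H0_grid, Om_grid)
OL = 1.0 - Om                           # flatness: Omega_L = 1 - Omega_M

# Age of a flat LCDM universe (standard analytic formula), in Gyr.
t_hubble = 9.78 / (H0 / 100.0)          # Hubble time 1/H0 in Gyr
t0 = t_hubble * 2.0 / (3.0 * np.sqrt(OL)) * np.arcsinh(np.sqrt(OL / Om))

# Hypothetical stand-in constraints (illustrative, not the paper's data):
chi2_age = ((t0 - 13.5) / 0.5) ** 2     # "OHRG" age bound, 13.5 +/- 0.5 Gyr
chi2_bao = ((Om - 0.27) / 0.03) ** 2    # "BAO" prior on Omega_M

chi2 = chi2_age + chi2_bao              # joint analysis: sum of chi-squares
i, j = np.unravel_index(chi2.argmin(), chi2.shape)
print(f"best fit: H0 = {H0[i, j]:.1f} km/s/Mpc, Omega_M = {Om[i, j]:.3f}")
```

    The age term alone constrains only a band in the $(H_0, \Omega_M)$ plane; adding the BAO term on $\Omega_M$ picks out a preferred $H_0$, which is the degeneracy-breaking mechanism the abstract describes.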

    Effects of age and gender on neural correlates of emotion imagery

    Mental imagery is part of people's own internal processing and plays an important role in everyday life, cognition and pathology. The neural network supporting mental imagery is modulated bottom-up by the imagery content. Here, we examined the complex associations of gender and age with the neural mechanisms underlying emotion imagery. Using fMRI in 91 men and women aged 14–65 years, we assessed the brain circuits involved in emotion mental imagery (vs. action imagery), controlled by a letter-detection task on the same stimuli, chosen to ensure attention to the stimuli and to discourage imagery. In women, compared with men, emotion imagery significantly increased activation within the right putamen, which is involved in emotional processing. Increasing age significantly decreased mental imagery-related activation in the left insula and cingulate cortex, areas involved in awareness of one's internal states, and it significantly decreased emotion verb-related activation in the left putamen, which is part of the limbic system. This finding suggests a top-down mechanism by which gender and age, in interaction with the bottom-up effect of the type of stimulus, or directly, can modulate the brain mechanisms underlying mental imagery.

    Statistical coverage for supersymmetric parameter estimation: a case study with direct detection of dark matter

    Models of weak-scale supersymmetry offer viable dark matter (DM) candidates. Their parameter spaces are, however, rather large and complex, such that pinning down the actual parameter values from experimental data can depend strongly on the employed statistical framework and scanning algorithm. In frequentist parameter estimation, a central requirement for properly constructed confidence intervals is that they cover the true parameter values, preferably at exactly the stated confidence level when experiments are repeated infinitely many times. Since most widely-used scanning techniques are optimised for Bayesian statistics, one needs to assess their ability to provide correct confidence intervals in terms of statistical coverage. Here we investigate this for the Constrained Minimal Supersymmetric Standard Model (CMSSM) when constrained only by data from direct searches for dark matter. We construct confidence intervals from one-dimensional profile likelihoods and study the coverage by generating several pseudo-experiments for a few benchmark sets of pseudo-true parameters. We use nested sampling to scan the parameter space and evaluate the coverage for the benchmarks when either flat or logarithmic priors are imposed on the gaugino and scalar mass parameters. The sampling algorithm has been used in the configuration usually adopted for exploration of the Bayesian posterior. We observe both under- and over-coverage, which in some cases vary quite dramatically when benchmarks or priors are modified. We show how most of the variation can be explained as the impact of explicit priors as well as sampling effects, where the latter are indirectly imposed by physicality conditions. For comparison, we also evaluate the coverage for Bayesian credible intervals, and observe significant under-coverage in those cases. Comment: 30 pages, 5 figures; v2 includes major updates in response to referee's comments; extra scans and tables added, discussion expanded, typos corrected; matches published version.
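
    The coverage test itself has a simple skeleton: repeat pseudo-experiments at a pseudo-true point, build an interval from each, and count how often the interval contains the truth. Below is a stripped-down one-parameter Gaussian toy in which the 68% profile-likelihood interval is analytic; in the paper this interval construction is replaced by a full nested-sampling scan of the CMSSM:

```python
import numpy as np

rng = np.random.default_rng(2)

theta_true = 5.0    # pseudo-true parameter (stand-in for a CMSSM benchmark)
sigma = 1.0         # known noise of the toy pseudo-experiments
n_pseudo = 2000     # number of pseudo-experiments

covered = 0
for _ in range(n_pseudo):
    # One pseudo-experiment; for a Gaussian toy the 68% CL interval from
    # the profile likelihood is simply x_obs +/- 1 sigma.
    x_obs = rng.normal(theta_true, sigma)
    lo, hi = x_obs - sigma, x_obs + sigma
    covered += (lo <= theta_true <= hi)

print(f"empirical coverage: {covered / n_pseudo:.3f} (nominal 0.683)")
```

    Exact coverage means the empirical fraction matches the nominal level; fractions below or above it correspond to the under- and over-coverage reported in the abstract.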