
    Measurements of muon multiple scattering in MICE

    Neutrino factories have been identified as the best facility for making precision measurements of neutrino oscillation physics. To fully realize this technology, a demonstration of the reduction of the phase space of a muon beam must be provided, and the Muon Ionization Cooling Experiment (MICE) is tasked with providing such a demonstration. Ionization cooling uses energy loss in a low-Z material, followed by re-acceleration in RF cavities, to reduce the phase space of a beam on a time scale much shorter than the muon lifetime. Multiple Coulomb scattering (MCS) simultaneously inflates the muon beam, so the interplay between energy loss and MCS must be well understood. Unfortunately, MCS is not well simulated in the materials of interest by the GEANT Monte Carlo program. A programme has commenced within MICE to measure MCS in several materials of interest, including lithium hydride, liquid hydrogen, and gaseous xenon. The experimental methods and early results are presented.
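    Comparisons of measured MCS with simulation are usually made against the PDG/Highland parameterisation of the RMS projected scattering angle. The sketch below illustrates that standard formula only; the momentum and absorber thickness are assumed values chosen for illustration, not MICE settings or results.

```python
import math

def highland_theta0(p_mev, beta, x_over_X0, z=1.0):
    """PDG/Highland estimate of the RMS projected multiple-scattering angle (radians).

    p_mev     : particle momentum in MeV/c
    beta      : v/c of the particle
    x_over_X0 : absorber thickness in radiation lengths
    z         : charge number (1 for a muon)
    """
    return (13.6 / (beta * p_mev)) * z * math.sqrt(x_over_X0) * (
        1.0 + 0.038 * math.log(x_over_X0)
    )

# Illustrative numbers only: a ~200 MeV/c muon crossing a few percent of a
# radiation length of a low-Z absorber.
p = 200.0                                   # MeV/c, assumed beam momentum
beta = p / math.sqrt(p**2 + 105.66**2)      # muon mass ~105.66 MeV/c^2
theta0 = highland_theta0(p, beta, x_over_X0=0.05)
print(f"theta_0 ~ {1e3 * theta0:.1f} mrad")
```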

    Bayesian Method of Moments (BMOM) Analysis of Mean and Regression Models

    A Bayesian method of moments/instrumental variable (BMOM/IV) approach is developed and applied to the analysis of mean and multiple regression models. Given a single set of data, it is shown how to obtain posterior and predictive moments without the use of likelihood functions, prior densities, or Bayes' theorem. The posterior and predictive moments, based on a few relatively weak assumptions, are then used to obtain maximum entropy densities for parameters, realized error terms, and future values of variables. Posterior means for parameters and realized error terms are shown to be equal to certain well-known estimates and are rationalized in terms of quadratic loss functions. Conditional maxent posterior densities for means and regression coefficients, given scale parameters, are in the normal form, while the scale parameters' maxent densities are in the exponential form. Marginal densities for individual regression coefficients, realized error terms, and future values are in the Laplace, or double-exponential, form, with heavier tails than normal densities with the same means and variances. It is concluded that these results will be very useful, particularly when there is difficulty in formulating the likelihood functions and prior densities needed in traditional maximum likelihood and Bayesian approaches. Comment: 14 pages, PostScript and PDF format
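    One concrete claim in the abstract is that the marginal maxent densities take the Laplace (double-exponential) form, with heavier tails than a normal density of the same mean and variance. The short sketch below is an illustration of that tail comparison rather than anything taken from the paper: both densities are standardised to unit variance and their two-sided tail probabilities compared.

```python
import math

def normal_tail(k):
    """P(|Z| > k) for a standard normal, via the complementary error function."""
    return math.erfc(k / math.sqrt(2.0))

def laplace_tail(k):
    """P(|X - mu| > k*sigma) for a Laplace density with the same mean and variance.

    A Laplace(mu, b) density has variance 2*b**2, so matching unit variance
    gives b = 1/sqrt(2); the two-sided tail probability is exp(-k/b).
    """
    b = 1.0 / math.sqrt(2.0)
    return math.exp(-k / b)

# The Laplace form has heavier extreme tails than the matched normal.
for k in (1, 2, 3):
    print(f"k={k}: normal {normal_tail(k):.4f}  laplace {laplace_tail(k):.4f}")
```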

    Bayes and empirical Bayes: do they merge?

    Bayesian inference is attractive for its coherence and good frequentist properties. However, it is a common experience that eliciting an honest prior may be difficult and, in practice, people often take an empirical Bayes approach, plugging empirical estimates of the prior hyperparameters into the posterior distribution. Even if not rigorously justified, the underlying idea is that, when the sample size is large, empirical Bayes leads to "similar" inferential answers. Yet, precise mathematical results seem to be missing. In this work, we give a more rigorous justification in terms of merging of Bayes and empirical Bayes posterior distributions. We consider two notions of merging: Bayesian weak merging and frequentist merging in total variation. Since weak merging is related to consistency, we provide sufficient conditions for consistency of empirical Bayes posteriors. Also, we show that, under regularity conditions, the empirical Bayes procedure asymptotically selects the value of the hyperparameter for which the prior most favors the "truth". Examples include empirical Bayes density estimation with Dirichlet process mixtures. Comment: 27 pages
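    The "plug-in" step the abstract refers to, replacing prior hyperparameters with empirical estimates before forming the posterior, is easiest to see in a simple normal-means model. The sketch below is only a hedged illustration of that mechanics under an assumed normal-normal setup; the paper's own examples concern Dirichlet process mixtures.

```python
import random
import statistics

# Simple normal-normal illustration of the empirical Bayes plug-in step:
# x_i | theta_i ~ N(theta_i, 1), theta_i ~ N(0, tau^2), tau unknown.
random.seed(0)

n, tau_true = 500, 2.0
theta = [random.gauss(0.0, tau_true) for _ in range(n)]   # latent means
x = [random.gauss(t, 1.0) for t in theta]                 # observations, unit noise

# Marginally x_i ~ N(0, 1 + tau^2), so a moment estimate of the hyperparameter is:
tau2_hat = max(0.0, statistics.fmean([xi * xi for xi in x]) - 1.0)

# Plug the estimate into the conditional posterior of theta_i given x_i:
shrink = tau2_hat / (1.0 + tau2_hat)
posterior_means = [shrink * xi for xi in x]

print(f"estimated tau^2 = {tau2_hat:.2f} (truth {tau_true**2:.2f})")
print(f"shrinkage factor = {shrink:.3f}")
```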