
    A powerful test based on tapering for use in functional data analysis

    A test based on tapering is proposed for use in testing a global linear hypothesis under a functional linear model. The test statistic is constructed as a weighted sum of squared linear combinations of Fourier coefficients, a tapered quadratic form in which higher Fourier frequencies are down-weighted so as to emphasize the smooth attributes of the model; it takes the form $Q_n^{OPT} = n \sum_{j=1}^{p_n} j^{-1/2} \|\boldsymbol{Y}_{n,j}\|^2$. The down-weighting by $j^{-1/2}$ is selected to achieve adaptive optimality among tests based on tapering with respect to ``rates of testing,'' an asymptotic framework for measuring a test's retention of power in high dimensions under smoothness constraints. Existing tests based on truncation or thresholding are known to have superior asymptotic power in comparison with any test based on tapering; however, it is shown here that high-order effects can be substantial, and that a test based on $Q_n^{OPT}$ exhibits better (non-asymptotic) power against the sort of alternatives that would typically be of concern in functional data analysis applications. The proposed test is developed for use in practice and demonstrated in an example application. Comment: Published at http://dx.doi.org/10.1214/08-EJS172 in the Electronic Journal of Statistics (http://www.i-journals.org/ejs/) by the Institute of Mathematical Statistics (http://www.imstat.org).
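    Assuming the Fourier-coefficient vectors $\boldsymbol{Y}_{n,j}$ are stored as rows of an array, the tapered statistic above reduces to a few lines. This is a minimal sketch; the function name and array layout are illustrative and not taken from the paper:

    ```python
    import numpy as np

    def tapered_statistic(Y, n):
        """Compute Q_n^OPT = n * sum_{j=1}^{p_n} j^{-1/2} * ||Y_{n,j}||^2.

        Y : (p_n, d) array whose j-th row is the Fourier-coefficient
            vector Y_{n,j}; n is the sample size.
        """
        p_n = Y.shape[0]
        weights = np.arange(1, p_n + 1) ** -0.5       # the j^{-1/2} taper
        row_norms_sq = np.sum(Y ** 2, axis=1)         # ||Y_{n,j}||^2 per row
        return n * np.sum(weights * row_norms_sq)
    ```

    The taper weights decay slowly, so low frequencies dominate the statistic while high frequencies still contribute, in contrast to truncation (which zeroes them out entirely).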

    Subjective Bayesian testing using calibrated prior probabilities

    This article proposes a calibration scheme for Bayesian testing that coordinates analytically derived statistical performance considerations with expert opinion. The scheme thus offers an effective and meaningful way to incorporate objective elements into subjective Bayesian inference. It explores a novel role for default priors as anchors for calibration rather than as substitutes for prior knowledge. Ideas are developed for use with multiplicity adjustments in multiple-model contexts and to address the issue of prior sensitivity of Bayes factors. Along the way, the performance properties of an existing multiplicity adjustment related to the Poisson distribution are clarified theoretically. Connections of the overall calibration scheme to the Schwarz criterion are also explored. The proposed framework is examined and illustrated on a number of existing data sets related to problems in clinical trials, forensic pattern matching, and log-linear models methodology. This is a manuscript of an article published as Spitzner, Dan J. "Subjective Bayesian testing using calibrated prior probabilities." Brazilian Journal of Probability and Statistics 33, no. 4 (2019): 861-893. Posted with permission of CSAFE.

    Risk-reducing shrinkage estimation for generalized linear models

    Empirical Bayes techniques for normal theory shrinkage estimation are extended to generalized linear models in a manner retaining the original spirit of shrinkage estimation, which is to reduce risk. The investigation identifies two classes of simple, all-purpose prior distributions, which supplement such non-informative priors as Jeffreys's prior with mechanisms for risk reduction. One new class of priors is motivated as optimizers of a core component of asymptotic risk. The methodology is evaluated in a numerical exploration and application to an existing data set. Copyright 2005 Royal Statistical Society.

    Threshold Dependence Of Mortality Effects For Fine And Coarse Particles In Phoenix, Arizona

    Daily data for fine (< 2.5 µm) and coarse (2.5-10 µm) particles are available for 1995-7 from the EPA research monitor in Phoenix. Mortality effects on the 65-and-over population are studied both for Phoenix city and for a region of about 50 miles around Phoenix. Coarse particles in Phoenix are believed to be natural in origin and spatially homogeneous, whereas fine particles are primarily vehicular in origin and concentrated in the city itself. For this reason, it is natural to focus on the city mortality data when considering fine particles, and on the region mortality data when considering coarse particles, and most of the results reported here correspond to those assignments. After allowing for seasonality and long-term trend through a nonlinear (B-spline) trend curve, and also for meteorological effects based on temperature and specific humidity, a regression was performed of mortality on PM, using several different measures for PM. Based on a linear PM effect, we find a statist..
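    The modeling recipe in this abstract — a B-spline trend for seasonality, meteorological covariates, and a linear PM term — can be sketched on simulated stand-in data. Everything below (variable names, knot placement, simulated coefficients) is illustrative and is not the Phoenix analysis itself:

    ```python
    import numpy as np
    from scipy.interpolate import BSpline

    rng = np.random.default_rng(0)
    days = np.arange(365.0)

    # Simulated covariates standing in for the Phoenix data
    pm = rng.gamma(shape=4.0, scale=5.0, size=days.size)           # daily PM level
    temp = 25 + 10 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 2, days.size)
    mortality = (50 + 5 * np.sin(2 * np.pi * days / 365)           # seasonal trend
                 + 0.1 * pm + 0.2 * temp + rng.normal(0, 1, days.size))

    # Cubic B-spline basis for the long-term/seasonal trend curve
    k = 3
    internal = np.linspace(days[0], days[-1], 8)[1:-1]             # interior knots
    t = np.r_[[days[0]] * (k + 1), internal, [days[-1]] * (k + 1)] # clamped knots
    trend_basis = BSpline.design_matrix(days, t, k).toarray()

    # Design matrix: spline trend + temperature + linear PM effect
    X = np.column_stack([trend_basis, temp, pm])
    beta, *_ = np.linalg.lstsq(X, mortality, rcond=None)
    pm_effect = beta[-1]   # estimated linear PM coefficient
    ```

    An ordinary least-squares fit is used here for brevity; the published analysis of daily mortality counts would more naturally use a Poisson regression, but the structure of the design matrix is the same.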