
    Consistency of Bayesian procedures for variable selection

    It has long been known that for the comparison of pairwise nested models, a decision based on the Bayes factor produces a consistent model selector (in the frequentist sense). Here we go beyond the usual consistency for nested pairwise models, and show that for a wide class of prior distributions, including intrinsic priors, the corresponding Bayesian procedure for variable selection in normal regression is consistent in the entire class of normal linear models. We find that the asymptotics of the Bayes factors for intrinsic priors are equivalent to those of the Schwarz (BIC) criterion. Also, recall that the Jeffreys--Lindley paradox refers to the well-known fact that a point null hypothesis on the normal mean parameter is always accepted when the variance of the conjugate prior goes to infinity. This implies that some limiting forms of proper prior distributions are not necessarily suitable for testing problems. Intrinsic priors are limits of proper prior distributions, and for finite sample sizes they have been proved to behave extremely well for variable selection in regression; a consequence of our results is that for intrinsic priors Lindley's paradox does not arise. Published in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics, http://dx.doi.org/10.1214/08-AOS606.
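    The asymptotic equivalence with the Schwarz criterion means that, in large samples, picking the model with the smaller BIC mimics the intrinsic-prior Bayes-factor decision. A minimal sketch of BIC-based comparison of two nested normal linear models, using hypothetical toy data (not from the paper):

```python
import math

def bic(rss, n, k):
    # Schwarz criterion for a normal linear model with k parameters:
    # BIC = n * log(RSS / n) + k * log(n); lower is better.
    return n * math.log(rss / n) + k * math.log(n)

# Hypothetical toy data with a clear linear trend.
x = [1, 2, 3, 4, 5, 6]
y = [1.1, 2.0, 2.9, 4.2, 5.1, 5.8]
n = len(x)

# Model 0: intercept only.
ybar = sum(y) / n
rss0 = sum((yi - ybar) ** 2 for yi in y)

# Model 1: intercept + slope (ordinary least squares in closed form).
xbar = sum(x) / n
beta = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
       / sum((xi - xbar) ** 2 for xi in x)
alpha = ybar - beta * xbar
rss1 = sum((yi - (alpha + beta * xi)) ** 2 for xi, yi in zip(x, y))

# The larger model wins here, as its fit improvement outweighs
# the log(n) complexity penalty.
print(bic(rss0, n, 1), bic(rss1, n, 2))
```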

    Exact Post-Selection Inference for Sequential Regression Procedures

    We propose new inference tools for forward stepwise regression, least angle regression, and the lasso. Assuming a Gaussian model for the observation vector y, we first describe a general scheme to perform valid inference after any selection event that can be characterized as y falling into a polyhedral set. This framework allows us to derive conditional (post-selection) hypothesis tests at any step of forward stepwise or least angle regression, or any step along the lasso regularization path, because, as it turns out, selection events for these procedures can be expressed as polyhedral constraints on y. The p-values associated with these tests are exactly uniform under the null distribution, in finite samples, yielding exact type I error control. The tests can also be inverted to produce confidence intervals for appropriate underlying regression parameters. The R package "selectiveInference", freely available on the CRAN repository, implements the new inference tools described in this paper. 26 pages, 5 figures.
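    As a toy illustration of the key observation (in Python, not the selectiveInference package itself, and with hypothetical data): the event that forward stepwise selects a given variable first, with a given sign, is exactly a set of linear inequalities on y, i.e. a polyhedron {y : Ay <= 0}.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Hypothetical design with two standardized columns and a response y.
x1 = [0.6, 0.8, 0.0]
x2 = [0.0, 0.6, 0.8]
y  = [1.0, 2.0, 0.5]

# First step of forward stepwise: pick the column with the largest
# absolute inner product with y, and record its sign.
c1, c2 = dot(x1, y), dot(x2, y)
if abs(c1) >= abs(c2):
    jstar, s = 1, (1 if c1 >= 0 else -1)
else:
    jstar, s = 2, (1 if c2 >= 0 else -1)

# The same selection event written as linear constraints A y <= 0:
# each row states that a competing signed inner product does not
# exceed the winner's.
winner = x1 if jstar == 1 else x2
others = [x2] if jstar == 1 else [x1]
A = []
for xj in others:
    for sign in (1, -1):
        A.append([sign * a - s * w for a, w in zip(xj, winner)])

# The observed y satisfies its own selection event.
print(jstar, s, all(dot(row, y) <= 1e-12 for row in A))
```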

    Selection and Evaluation of External Auditors - Policies and Procedures


    Effects of Selection Systems on Job Search Decisions

    On the basis of Gilliland's (1993) model of selection system fairness, the present study investigated the relationships between selection procedures, perceived selection system fairness, and job search decisions in both hypothetical and actual organizations. We conducted two studies to test the model. In Study 1, we used an experimental method to examine job seekers' perceptions of, and reactions to, five widely used selection procedures. Results suggested that applicants viewed employment interviews and cognitive ability tests as more job related than biographical inventories (biodata), personality tests, and drug tests, and that job relatedness significantly affected fairness perceptions, which in turn affected job search decisions. Study 2 examined the hypothesized relationships between the selection systems and job seekers' pursuit of actual, relevant organizations. Results from both studies offer support for the hypothesized model, suggesting that selection tests have differential effects on perceived selection system validity and fairness, which affect subsequent job search decisions.

    Improving selection stability of multiple testing procedures for fMRI

    In search of an appropriate thresholding technique in the analysis of functional MRI data, several methods to prevent an inflation of false positives have been proposed. Two popular (voxelwise) methods are the Bonferroni procedure (BF), which controls the familywise error rate (FWER), and the Benjamini-Hochberg procedure (BH), which controls the false discovery rate (FDR) (Benjamini & Hochberg 1995). Multiple testing procedures are typically evaluated on their average performance with respect to error rates, ignoring the aspect of variability. Resampling techniques make it possible to assess the selection variability of individual features (voxels). Following the approach of Gordon, Chen, Glazko & Yakovlev (2009) in the context of gene selection, we investigated whether variability on test results for BF and BH can be reduced by including both the significance and selection variability of the voxels in the decision criterion.
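    For reference, the two thresholding rules compared in the abstract can be sketched as follows (the p-values are hypothetical; note that the step-up BH rule rejects at least as many hypotheses as Bonferroni):

```python
def bonferroni(pvals, alpha=0.05):
    # Reject p_i <= alpha / m, controlling the familywise error rate.
    m = len(pvals)
    return [p <= alpha / m for p in pvals]

def benjamini_hochberg(pvals, alpha=0.05):
    # Step-up procedure controlling the false discovery rate: find the
    # largest rank k with p_(k) <= k * alpha / m, then reject every
    # p-value at or below p_(k).
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:
            k = rank
    cutoff = pvals[order[k - 1]] if k else -1.0
    return [p <= cutoff for p in pvals]

# Hypothetical voxelwise p-values.
pvals = [0.001, 0.008, 0.012, 0.041, 0.20, 0.74]
print(sum(bonferroni(pvals)), sum(benjamini_hochberg(pvals)))  # FDR control is less conservative
```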

    Selection Criteria for the Honors Program in Azerbaijan

    Designing effective selection procedures for honors programs is always a challenging task. In Azerbaijan, selection is based on three main criteria: (i) student performance in the centralized university admission test; (ii) student performance in the first year of studies; and (iii) student performance in the honors program selection test. This research identifies the criteria most crucial in predicting student success in honors programs. An analysis was first conducted for all honors students. Results indicate that all three criteria used in the selection process are highly significant predictors of student success in the program. The same analysis was then applied separately for each degree program, demonstrating that not all criteria are significant for some programs. These results suggest that creating differentiated selection procedures for different degree programs might be more efficient.

    Variable selection in semiparametric regression modeling

    In this paper, we are concerned with how to select significant variables in semiparametric modeling. Variable selection for semiparametric regression models consists of two components: model selection for nonparametric components and selection of significant variables for the parametric portion. Thus, semiparametric variable selection is much more challenging than parametric variable selection (e.g., linear and generalized linear models) because traditional variable selection procedures including stepwise regression and the best subset selection now require separate model selection for the nonparametric components for each submodel. This leads to a very heavy computational burden. In this paper, we propose a class of variable selection procedures for semiparametric regression models using nonconcave penalized likelihood. We establish the rate of convergence of the resulting estimate. With proper choices of penalty functions and regularization parameters, we show the asymptotic normality of the resulting estimate and further demonstrate that the proposed procedures perform as well as an oracle procedure. A semiparametric generalized likelihood ratio test is proposed to select significant variables in the nonparametric component. We investigate the asymptotic behavior of the proposed test and demonstrate that its limiting null distribution follows a chi-square distribution which is independent of the nuisance parameters. Extensive Monte Carlo simulation studies are conducted to examine the finite sample performance of the proposed variable selection procedures. Published in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics, http://dx.doi.org/10.1214/009053607000000604.
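    Nonconcave penalized likelihood is commonly instantiated with the SCAD penalty of Fan and Li (2001); whether this paper uses SCAD specifically is an assumption here, but its one-dimensional thresholding rule illustrates why such penalties can match an oracle: small effects are set exactly to zero while large effects are left unshrunk.

```python
def scad_threshold(z, lam, a=3.7):
    # One-dimensional SCAD thresholding rule (Fan & Li, 2001) applied
    # to a least-squares coefficient z with tuning parameter lam.
    az = abs(z)
    s = 1.0 if z >= 0 else -1.0
    if az <= 2 * lam:                      # soft-thresholding zone
        return s * max(az - lam, 0.0)
    if az <= a * lam:                      # linear interpolation zone
        return ((a - 1) * z - s * a * lam) / (a - 2)
    return z                               # large signals are not shrunk

# A small coefficient is zeroed out; a large one passes through unchanged.
print(scad_threshold(0.5, 1.0), scad_threshold(5.0, 1.0))
```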

    Ordering Pareto-Optima Through Majority Voting

    A commodity is shared among several individuals, and a selection procedure is used to choose allocations. To reflect that laws and rules rather than allocations are implemented, and that they involve an element of randomness because of incomplete information, selection procedures are taken to be probability measures over the set of allocations. Illustrations and interpretations of the selection procedures are given.

    Keywords: Pareto-optimal allocations; infra-majority voting