
    Vital Signs: Arts Funding in the Current Economy - The Outlook for Foundation Giving & Public Funding for the Arts: 2009 Update

    The following analysis documents the Foundation Center's latest findings on how giving by the universe of more than 75,000 grantmaking U.S. independent, corporate, community, and operating foundations may fare in 2009 and 2010. It also compares how giving by foundations with a field of interest in arts and culture may fare relative to overall foundation giving. While the analysis suggests that the current economic downturn may affect foundation giving for the arts, the data did not allow an estimate of the overall levels of foundation giving for the arts in 2009 and beyond.

    Optimal Estimation of Multidimensional Normal Means With an Unknown Variance

    Let X ∼ N_p(θ, σ² I_p) and W ∼ σ² χ²_m, where both θ and σ² are unknown, and X is independent of W. Optimal estimation of θ with unknown σ² is a fundamental issue in applications, but basic theoretical questions remain open. We consider estimation of θ under squared error loss. We develop sufficient conditions on prior density functions such that the corresponding generalized Bayes estimators of θ are admissible. This paper has a two-fold purpose: (1) provide a benchmark for the evaluation of shrinkage estimation of a multivariate normal mean with an unknown variance; (2) use admissibility as a criterion for selecting priors in hierarchical Bayes models. To illustrate how to select hierarchical priors, we apply these sufficient conditions to a widely used hierarchical Bayes model proposed by Maruyama and Strawderman (M-S, 2005), and obtain a class of admissible and minimax generalized Bayes estimators of the normal mean θ. We also develop necessary conditions for admissibility of generalized Bayes estimators in the M-S (2005) hierarchical Bayes model. All the results in this paper apply directly in the familiar setting of Gaussian linear regression.
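    As a concrete point of reference for the setting above, the classical James-Stein-type shrinkage estimator handles unknown σ² by plugging in the independent variance statistic W: θ̂ = (1 − ((p−2)/(m+2)) W/‖X‖²) X, which is minimax for p ≥ 3. This is a minimal sketch of that benchmark estimator only, not the generalized Bayes estimators of the paper; the function name `js_unknown_var` and the simulation values are illustrative.

    ```python
    import numpy as np

    def js_unknown_var(x, w, m):
        """James-Stein shrinkage for N_p(theta, sigma^2 I_p) with unknown
        sigma^2, estimated through W ~ sigma^2 chi^2_m (independent of X)."""
        p = x.size
        shrink = ((p - 2) / (m + 2)) * w / np.dot(x, x)
        return (1.0 - shrink) * x

    # Small simulation at a sparse (here: zero) mean, where shrinkage helps most.
    rng = np.random.default_rng(0)
    p, m, sigma2 = 10, 20, 4.0
    theta = np.zeros(p)
    x = rng.normal(theta, np.sqrt(sigma2))   # X ~ N_p(theta, sigma^2 I_p)
    w = sigma2 * rng.chisquare(m)            # W ~ sigma^2 chi^2_m

    theta_js = js_unknown_var(x, w, m)
    # Compare squared-error losses of X and the shrinkage estimate on this draw.
    print(np.sum((x - theta) ** 2), np.sum((theta_js - theta) ** 2))
    ```

    Because W enters only through the scalar shrinkage factor, the same code applies after any rotation of the coordinates, which is why such results transfer to Gaussian linear regression.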

    Hunting for Significance: Bayesian Classifiers Under a Mixture Loss Function

    Detecting significance in a high-dimensional sparse data structure has received a large amount of attention in modern statistics. In the current paper, we introduce a compound decision rule to simultaneously separate signals from noise. This procedure is a Bayes rule under a mixture loss function. The loss function penalizes false discoveries while controlling false nondiscoveries by incorporating signal strength information: under our criterion, strong signals are penalized more heavily for nondiscovery than weak signals. In constructing this classification rule, we assume a mixture prior for the parameter that adapts to the unknown sparsity. The resulting Bayes rule can be viewed as thresholding the "local fdr" (Efron, 2007) at adaptive thresholds. Both parametric and nonparametric methods are discussed. The nonparametric procedure adapts well to the unknown data structure and outperforms the parametric one. Performance of the procedure is illustrated by various simulation studies and a real data application.
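    The local-fdr quantity being thresholded can be sketched under a simple two-group Gaussian mixture f(x) = π₀ N(0, 1) + (1 − π₀) N(0, 1 + τ²). This is only an illustration of fixed-threshold local-fdr classification, not the paper's adaptive-threshold rule; the values π₀ = 0.9, τ² = 4, and the 0.2 cutoff, as well as the names `local_fdr` and `classify`, are assumptions for the sketch (in practice π₀ and τ² must be estimated).

    ```python
    import numpy as np

    def _gauss_pdf(x, var):
        """Density of N(0, var)."""
        return np.exp(-x**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

    def local_fdr(x, pi0=0.9, tau2=4.0):
        """P(null | x) under the two-group mixture:
        f(x) = pi0 * N(0,1) + (1 - pi0) * N(0, 1 + tau2)."""
        f0 = _gauss_pdf(x, 1.0)
        f1 = _gauss_pdf(x, 1.0 + tau2)
        f = pi0 * f0 + (1.0 - pi0) * f1
        return pi0 * f0 / f

    def classify(x, threshold=0.2):
        """Declare a signal wherever the local fdr falls below the threshold."""
        return local_fdr(x) < threshold
    ```

    The local fdr is large near zero (observations that look like noise) and decays as |x| grows, so thresholding it is equivalent to declaring signals outside a symmetric cutoff; the adaptive thresholds of the paper sharpen this by letting the cutoff depend on signal strength.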