Double spike Dirichlet priors for structured weighting
Assigning weights to a large pool of objects is a fundamental task in a wide
variety of applications. In this article, we introduce the concept of a structured
high-dimensional probability simplex, in which most components are zero or near
zero and the remaining ones are close to each other. Such a structure is well
motivated by 1) high-dimensional weights that are common in modern
applications, and 2) ubiquitous examples in which equal weights -- despite
their simplicity -- often achieve favorable or even state-of-the-art predictive
performance. This particular structure, however, presents unique challenges
both computationally and statistically. To address these challenges, we propose
a new class of double spike Dirichlet priors to shrink a probability simplex to
one with the desired structure. When applied to ensemble learning, such priors
lead to a Bayesian method for structured high-dimensional ensembles that is
useful for forecast combination and improving random forests, while enabling
uncertainty quantification. We design efficient Markov chain Monte Carlo
algorithms for easy implementation. Posterior contraction rates are established
to provide theoretical support. We demonstrate the wide applicability and
competitive performance of the proposed methods through simulations and two
real data applications using the European Central Bank Survey of Professional
Forecasters dataset and a UCI dataset.
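To make the target structure concrete, the following is a minimal sketch of the kind of weight vector the abstract describes: most components near zero, the remaining few close to the equal weight 1/k. This is an illustrative construction only, not the paper's double spike Dirichlet prior; the function name, parameters, and the pick-k-then-jitter scheme are all hypothetical assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def structured_simplex(p=1000, k=5, spread=0.05, eps=1e-6):
    """Draw one structured weight vector of length p (illustration only).

    p      : ambient dimension (e.g. number of candidate forecasters)
    k      : number of "active" components carrying non-negligible weight
    spread : relative jitter around the equal weight 1/k
    eps    : magnitude of the near-zero components
    """
    w = np.full(p, eps)                        # near-zero "spike" for most entries
    active = rng.choice(p, size=k, replace=False)
    # active components sit close to the equal weight 1/k
    w[active] = (1.0 / k) * (1.0 + spread * rng.standard_normal(k))
    return w / w.sum()                         # renormalize onto the simplex

w = structured_simplex()
print(w.sum())            # sums to 1 up to floating point
print((w > 1e-3).sum())   # only the k active components carry real weight
```

The point of the sketch is just the shape of the object: sparsity over most coordinates plus near-equal weights on the active set, which is the structure the proposed priors are designed to shrink toward.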