
    Almost the Best of Three Worlds: Risk, Consistency and Optional Stopping for the Switch Criterion in Nested Model Selection

    We study the switch distribution, introduced by Van Erven et al. (2012), applied to model selection and subsequent estimation. While switching was known to be strongly consistent, here we show that it achieves minimax optimal parametric risk rates up to a $\log\log n$ factor when comparing two nested exponential families, partially confirming a conjecture by Lauritzen (2012) and Cavanaugh (2012) that switching behaves asymptotically like the Hannan-Quinn criterion. Moreover, like Bayes factor model selection but unlike standard significance testing, when one of the models represents a simple hypothesis, the switch criterion defines a robust null hypothesis test, meaning that its Type-I error probability can be bounded irrespective of the stopping rule. Hence, switching is consistent, insensitive to optional stopping and almost minimax risk optimal, showing that, Yang's (2005) impossibility result notwithstanding, it is possible to 'almost' combine the strengths of AIC and Bayes factor model selection.
    Comment: To appear in Statistica Sinica.
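
    The robustness to optional stopping claimed for the switch criterion (a property it shares with Bayes factor tests of a simple null) can be illustrated with a small simulation. The sketch below is not the switch criterion itself: it uses an ordinary Gaussian Bayes factor with an assumed N(0,1) prior on the mean, which forms a nonnegative martingale under the null, so Ville's inequality bounds its Type-I error at alpha under any stopping rule, while a naive z-test checked after every observation has no such guarantee. All parameter choices here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.05      # target Type-I error bound
n_max = 1000      # the "adversary" may stop at any n up to this
n_sims = 2000

bf_rejections = 0  # Bayes-factor test: reject if BF ever exceeds 1/alpha
z_rejections = 0   # naive z-test applied at every n

for _ in range(n_sims):
    x = rng.standard_normal(n_max)   # data generated under the null N(0, 1)
    s = np.cumsum(x)                 # running sums S_n
    n = np.arange(1, n_max + 1)

    # Bayes factor for H1: mu ~ N(0, 1) vs H0: mu = 0 has the closed form
    # BF_n = exp(S_n^2 / (2(n+1))) / sqrt(n+1), a nonnegative martingale
    # under H0; Ville's inequality gives P(sup_n BF_n >= 1/alpha) <= alpha
    # for ANY stopping rule.
    log_bf = s**2 / (2 * (n + 1)) - 0.5 * np.log(n + 1)
    if np.any(log_bf >= -np.log(alpha)):
        bf_rejections += 1

    # Fixed-level z-test checked after every observation: its Type-I
    # error inflates well above alpha under this stopping rule.
    if np.any(np.abs(s / np.sqrt(n)) >= 1.96):
        z_rejections += 1

print(f"Bayes-factor test Type-I rate: {bf_rejections / n_sims:.3f} (<= {alpha})")
print(f"Naive z-test Type-I rate:      {z_rejections / n_sims:.3f} (inflated)")
```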

    Adaptive posterior contraction rates for the horseshoe

    We investigate the frequentist properties of Bayesian procedures for estimation based on the horseshoe prior in the sparse multivariate normal means model. Previous theoretical results assumed that the sparsity level, that is, the number of signals, was known. We drop this assumption and characterize the behavior of the maximum marginal likelihood estimator (MMLE) of a key parameter of the horseshoe prior. We prove that the MMLE is an effective estimator of the sparsity level, in the sense that it leads to (near) minimax optimal estimation of the underlying mean vector generating the data. Besides this empirical Bayes procedure, we consider the hierarchical Bayes method of putting a prior on the unknown sparsity level as well. We show that both Bayesian techniques lead to rate-adaptive optimal posterior contraction, which implies that the horseshoe posterior is a good candidate for generating rate-adaptive credible sets.
    Comment: arXiv admin note: substantial text overlap with arXiv:1607.0189
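
    The empirical Bayes (MMLE) procedure described above can be sketched numerically. The following is a minimal illustration, not the paper's implementation: it assumes unit noise variance, integrates out the local half-Cauchy scale by quadrature to obtain the marginal likelihood of the global parameter tau, and maximizes it over a grid restricted to [1/n, 1]; the function names and the simulated data are hypothetical.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def log_marginal(y, tau):
    """log m_tau(y) for one observation y = theta + N(0, 1) noise, where
    theta | lambda ~ N(0, (tau * lambda)^2) and lambda ~ half-Cauchy(0, 1).
    The local scale lambda is integrated out numerically."""
    def integrand(lam):
        sd = np.sqrt(1.0 + (tau * lam) ** 2)
        return norm.pdf(y, scale=sd) * (2.0 / np.pi) / (1.0 + lam ** 2)
    val, _ = quad(integrand, 0.0, np.inf)
    return np.log(val)

def mmle_tau(y, grid=None):
    """Maximum marginal likelihood estimate of tau by grid search,
    with tau restricted to [1/n, 1]."""
    n = len(y)
    if grid is None:
        grid = np.geomspace(1.0 / n, 1.0, 40)
    log_lik = [sum(log_marginal(yi, tau) for yi in y) for tau in grid]
    return grid[int(np.argmax(log_lik))]

# Simulated sparse normal means problem: n = 100, 10 signals of size 5.
rng = np.random.default_rng(1)
theta = np.zeros(100)
theta[:10] = 5.0
y = theta + rng.standard_normal(100)
print("MMLE of tau:", mmle_tau(y))  # a small value, reflecting the sparsity
```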

    Migration Paradigm Shifts and Transformation of Migrant Communities: The Case of Dutch Kiwis

    This paper explores the dynamics of Dutch community change in New Zealand since 1950. The Netherlands has been the largest source country of migrants from continental Europe to New Zealand, but by 2006 40 percent of the Netherlands-born were aged 65 or older. We find that there are three distinct cohorts of these migrants, each covering roughly 20 years of arrivals: a large cohort of post-war migrants (those who arrived in the 1950s and 1960s), a much smaller cohort of skilled migrants (those who arrived in the 1970s and 1980s), and transnational professionals (those who arrived in the 1990s or more recently). Early migrants were mostly younger on arrival, more religious, less educated and had more children than the subsequent cohorts. More recent migrants are increasingly highly qualified and in high-skill occupations. "Dutch Kiwis" are more geographically dispersed than other immigrants, and more recent arrivals are relatively more often located in rural areas. This transformation of the Dutch community in New Zealand can be linked to global and New Zealand/Netherlands-specific changes that have conditioned the character and volume of the migrant flows and the dynamics of migrant community development.
    Keywords: globalisation, push and pull factors of migration, ageing of migrant communities, migrant integration, cohort analysis

    The Past, Present and Future of High Performance Computing

    In this overview paper we start by looking at the birth of what is called "High Performance Computing" today. It all began over 30 years ago when the Cray 1 and CDC Cyber 205 "supercomputers" were introduced. This had a huge impact on scientific computing. A very turbulent time at both the hardware and software level was to follow. Eventually the situation stabilized, but not for long. Today, two different trends in hardware architecture have created a bifurcation in the market: on one hand, the GPGPU quickly found a place in the marketplace but remains the domain of the expert; in contrast, multicore processors make hardware parallelism available to the masses. Each has its own set of issues to deal with. In the last section we make an attempt to look into the future, but this is of course a highly personal opinion.