Testing quantum mechanics: a statistical approach
As experiments continue to push the quantum-classical boundary using
increasingly complex dynamical systems, the interpretation of experimental data
becomes more and more challenging: when the observations are noisy, indirect,
and limited, how can we be sure that we are observing quantum behavior? This
tutorial highlights some of the difficulties in such experimental tests of
quantum mechanics, using optomechanics as the central example, and discusses
how the issues can be resolved using techniques from statistics and insights
from quantum information theory. Comment: v1: 2 pages; v2: invited tutorial for Quantum Measurements and
Quantum Metrology, substantial expansion of v1, 19 pages; v3: accepted; v4:
corrected some errors, published
Consumption and Income Smoothing
This paper presents a two-sector dynamic general equilibrium model in which income smoothing takes place within the households (intra-temporally), and consumption smoothing takes place among the households (inter-temporally). Idiosyncratic risk sharing within the family is based on an income smoothing contract. There are two sectors in the model, the regular sector and the underground sector, and the smoothing comes from the underground sector, which is countercyclical with respect to aggregate GDP. The paper shows that the simulated disaggregated consumption and income series (that is, the regular and underground consumption flows) are more sensitive to exogenous changes in sector-specific productivity and tax rates than the regular and underground income flows, and that this picture is reversed when the aggregate series are considered.
Regularized brain reading with shrinkage and smoothing
Functional neuroimaging measures how the brain responds to complex stimuli.
However, sample sizes are modest, noise is substantial, and stimuli are high
dimensional. Hence, direct estimates are inherently imprecise and call for
regularization. We compare a suite of approaches which regularize via
shrinkage: ridge regression, the elastic net (a generalization of ridge
regression and the lasso), and a hierarchical Bayesian model based on small
area estimation (SAE). We contrast regularization with spatial smoothing and
combinations of smoothing and shrinkage. All methods are tested on functional
magnetic resonance imaging (fMRI) data from multiple subjects participating in
two different experiments related to reading, for both predicting neural
response to stimuli and decoding stimuli from responses. Interestingly, when
the regularization parameters are chosen by cross-validation independently for
every voxel, low/high regularization is chosen in voxels where the
classification accuracy is high/low, indicating that the regularization
intensity is a good tool for identification of relevant voxels for the
cognitive task. Surprisingly, all the regularization methods work about equally
well, suggesting that beating basic smoothing and shrinkage will take not only
clever methods, but also careful modeling. Comment: Published at http://dx.doi.org/10.1214/15-AOAS837 in the Annals of
Applied Statistics (http://www.imstat.org/aoas/) by the Institute of
Mathematical Statistics (http://www.imstat.org)
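The abstract above compares several shrinkage regularizers with cross-validated penalties. As a minimal sketch of the simplest of these, ridge regression with a cross-validated penalty, the following numpy example fits a high-dimensional, noisy synthetic problem (the data, penalty grid, and fold count are illustrative assumptions; the paper's elastic-net and SAE variants are not reproduced here):

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge solution: (X'X + lam*I)^{-1} X'y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def cv_ridge(X, y, lambdas, k=5, seed=0):
    """Choose the ridge penalty by k-fold cross-validated squared error."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    errs = []
    for lam in lambdas:
        err = 0.0
        for f in folds:
            train = np.setdiff1d(idx, f)        # all indices outside the fold
            w = ridge_fit(X[train], y[train], lam)
            err += np.sum((X[f] @ w - y[f]) ** 2)
        errs.append(err / len(y))
    return lambdas[int(np.argmin(errs))], errs

# Synthetic high-dimensional, noisy setting: more features than samples,
# with only a few truly relevant predictors (loosely analogous to voxels).
rng = np.random.default_rng(1)
n, d = 40, 100
w_true = np.zeros(d)
w_true[:5] = 2.0
X = rng.normal(size=(n, d))
y = X @ w_true + rng.normal(scale=1.0, size=n)

best_lam, _ = cv_ridge(X, y, lambdas=np.logspace(-2, 3, 12))
w_hat = ridge_fit(X, y, best_lam)
```

With n < d the unpenalized normal equations are singular, so some positive penalty is always selected; this is the "direct estimates are inherently imprecise and call for regularization" point in miniature.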
The theory and practice of interest rate smoothing
The interest rate policy of the Magyar Nemzeti Bank typically consists of taking several smaller steps in one direction. Other central banks follow similar practices. Their interest rate policy actions are characterised by gradual changes: in other words, they avoid sudden, major changes in interest rates and are wary of reversing interest rate cycles too frequently. This study will present the theoretical background of the practice of such interest rate smoothing, the motivations of central banks as revealed by their communication, and some important considerations for Hungarian monetary policy. Keywords: interest rate smoothing, base rate, monetary policy.
Bibliographic Review on Distributed Kalman Filtering
In recent years, a compelling need has arisen to understand the effects of distributed information structures on estimation and filtering. In this paper, a bibliographical review on distributed Kalman filtering (DKF) is provided.
The paper contains a classification of the different approaches and methods involved in DKF. The applications of DKF are also discussed and explained separately. A comparison of different approaches is briefly carried out. Contemporary research directions are also addressed, with emphasis on the practical applications of the techniques. An exhaustive list of publications, linked directly or indirectly to DKF in the open literature, is compiled to provide an overall picture of the different developing aspects of this area.
Multiplicative local linear hazard estimation and best one-sided cross-validation
This paper develops detailed mathematical statistical theory for a new class of cross-validation techniques for local linear kernel hazard estimators and their multiplicative bias corrections. The new class combines principles of local information with recent advances in indirect cross-validation. A few applications of cross-validating multiplicative kernel hazard estimators do exist in the literature; however, detailed mathematical statistical theory and small-sample performance are introduced in this paper and further extended to our new class of best one-sided cross-validation. Best one-sided cross-validation turns out to have excellent performance in its practical illustrations, in its small-sample behaviour and in its mathematical statistical properties.
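To illustrate the cross-validation principle this paper builds on, here is ordinary leave-one-out bandwidth selection for a Nadaraya-Watson kernel smoother in the regression setting (an illustrative stand-in: neither the hazard estimator nor the paper's indirect/best one-sided variants, and the data are synthetic assumptions):

```python
import numpy as np

def nw_smooth(x_train, y_train, x_eval, h):
    """Nadaraya-Watson kernel smoother with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
    return (w @ y_train) / w.sum(axis=1)

def loo_cv_bandwidth(x, y, bandwidths):
    """Leave-one-out cross-validation: for each h, predict each point
    from all the others and score by mean squared prediction error."""
    scores = []
    n = len(x)
    for h in bandwidths:
        err = 0.0
        for i in range(n):
            mask = np.arange(n) != i
            pred = nw_smooth(x[mask], y[mask], x[i:i + 1], h)[0]
            err += (y[i] - pred) ** 2
        scores.append(err / n)
    return bandwidths[int(np.argmin(scores))]

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 2 * np.pi, 80))
y = np.sin(x) + rng.normal(scale=0.3, size=80)
h_star = loo_cv_bandwidth(x, y, bandwidths=np.linspace(0.05, 1.0, 20))
```

One-sided variants score each point using only neighbours on one side, which is what makes them attractive near boundaries and in the hazard setting the paper studies.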