
    Lower Bounds on Exponential Moments of the Quadratic Error in Parameter Estimation

    Considering the problem of risk-sensitive parameter estimation, we propose a fairly wide family of lower bounds on the exponential moments of the quadratic error, in both the Bayesian and the non-Bayesian regime. This family of bounds, which is based on a change of measures, offers considerable freedom in the choice of the reference measure, and our efforts are devoted to exploring this freedom to a certain extent. Our focus is mostly on signal models that are relevant to communication problems, namely, models of a parameter-dependent (modulated) signal corrupted by additive white Gaussian noise, but the proposed methodology is also applicable to other types of parametric families, such as models of linear systems driven by random input signals (white noise, in most cases), and others. In addition to the well-known motivations for the risk-sensitive cost function (i.e., the exponential quadratic cost function), most notably its robustness to model uncertainty, we also view this cost function as a tool for studying fundamental limits on the tail behavior of the estimation error. Another interesting aspect, demonstrated in a certain parametric model, is that the risk-sensitive cost function may exhibit phase transitions, owing to some analogies with statistical mechanics.
    Comment: 28 pages; 4 figures; submitted for publication
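    For context only (a generic illustration, not the paper's specific family of bounds): the risk-sensitive cost is the exponential moment of the quadratic error, $\mathbb{E}_P\big[e^{s(\hat{\theta}-\theta)^2}\big]$ for a risk parameter $s>0$. The change-of-measure idea named in the abstract can be sketched as follows: for any reference measure $Q$ with $P \ll Q$, Jensen's inequality gives
    $$\mathbb{E}_P\big[e^{s(\hat{\theta}-\theta)^2}\big] = \mathbb{E}_Q\Big[e^{s(\hat{\theta}-\theta)^2}\,\tfrac{dP}{dQ}\Big] \ge \exp\Big(s\,\mathbb{E}_Q\big[(\hat{\theta}-\theta)^2\big] - D(Q\|P)\Big),$$
    so any tractable choice of $Q$ yields a lower bound in terms of a quadratic-error moment under $Q$ and a relative-entropy term.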

    Meta learning of bounds on the Bayes classifier error

    Meta learning uses information from base learners (e.g., classifiers or estimators) as well as information about the learning problem to improve upon the performance of a single base learner. For example, the Bayes error rate of a given feature space, if known, can be used to aid in choosing a classifier, as well as in feature selection and model selection for the base classifiers and the meta classifier. Recent work in f-divergence functional estimation has led to simple and rapidly converging estimators that can be used to estimate various bounds on the Bayes error. We estimate multiple bounds on the Bayes error using an estimator that applies meta learning to slowly converging plug-in estimators to obtain the parametric convergence rate. We compare the estimated bounds empirically on simulated data and then estimate the tighter bounds on features extracted from an image patch analysis of sunspot continuum and magnetogram images.
    Comment: 6 pages, 3 figures, to appear in the proceedings of the 2015 IEEE Signal Processing and SP Education Workshop
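    As background (standard results, not the paper's specific estimators): for a binary problem with priors $p_1, p_2$ and class-conditional densities $f_1, f_2$, the Bayes error is
    $$E_{\mathrm{Bayes}} = \int \min\{p_1 f_1(x),\, p_2 f_2(x)\}\, dx,$$
    and, since $\min\{a,b\} \le \sqrt{ab}$, it admits divergence-type upper bounds such as the Bhattacharyya bound
    $$E_{\mathrm{Bayes}} \le \sqrt{p_1 p_2}\int \sqrt{f_1(x)\, f_2(x)}\, dx.$$
    Bounds of this kind are functionals of the densities, which is why f-divergence functional estimators can be used to estimate them from data.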

    Conservative classical and quantum resolution limits for incoherent imaging

    I propose classical and quantum limits to the statistical resolution of two incoherent optical point sources from the perspective of minimax parameter estimation. Unlike earlier results based on the Cramér-Rao bound, the limits proposed here, based on the worst-case error criterion and a Bayesian version of the Cramér-Rao bound, are valid for any biased or unbiased estimator and obey photon-number scalings that are consistent with the behaviors of actual estimators. These results prove that, from the minimax perspective, the spatial-mode demultiplexing (SPADE) measurement scheme recently proposed by Tsang, Nair, and Lu [Phys. Rev. X 6, 031033 (2016)] remains superior to direct imaging for sufficiently high photon numbers.
    Comment: 12 pages, 2 figures. v2: focused on imaging, cleaned up the math, added new analytic and numerical results. v3: restructured and submitted
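    For reference (the standard scalar form, not quoted from the paper): one Bayesian version of the Cramér-Rao bound is the Van Trees inequality, which holds for any estimator $\hat{\theta}$, biased or unbiased. With prior $\pi$, data Fisher information $J(\theta)$, and prior Fisher information $J(\pi) = \int \pi'(\theta)^2 / \pi(\theta)\, d\theta$,
    $$\mathbb{E}\big[(\hat{\theta}-\theta)^2\big] \ge \frac{1}{\mathbb{E}_\pi[J(\theta)] + J(\pi)},$$
    where the expectation on the left is over both the data and the prior. Estimator-independent bounds of this type are the ingredients behind worst-case (minimax) resolution limits of the kind the abstract describes.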