
    What's the big idea? CramĂ©r–Rao inequality and Rao distance

    Get PDF
    Angel Ricardo Plastino and Angelo Plastino give a brief introduction to two key developments in statistics that originate with C. R. Rao's 1945 paper, “Information and accuracy attainable in the estimation of statistical parameters”.
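    For context, the two results surveyed can be written in their standard textbook forms (a schematic reminder, not the article's own notation). For a single parameter \theta and an unbiased estimator \hat{\theta} based on data X with density f(x;\theta), the Cramér–Rao inequality reads
    \[
    \operatorname{Var}_\theta\big(\hat{\theta}\big) \;\ge\; \frac{1}{I(\theta)},
    \qquad
    I(\theta) = \mathbb{E}_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right],
    \]
    while the Rao distance between two members of a parametric family is the geodesic distance induced by the Fisher–Rao metric
    \[
    ds^{2} = \sum_{i,j} g_{ij}(\theta)\,d\theta_{i}\,d\theta_{j},
    \qquad
    g_{ij}(\theta) = \mathbb{E}_\theta\!\left[\frac{\partial \log f(X;\theta)}{\partial\theta_{i}}\,\frac{\partial \log f(X;\theta)}{\partial\theta_{j}}\right].
    \]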

    On some interrelations of generalized q-entropies and a generalized Fisher information, including a Cramér-Rao inequality

    Get PDF
    In this communication, we describe some interrelations between generalized q-entropies and a generalized version of Fisher information. In information theory, the de Bruijn identity links the Fisher information and the derivative of the entropy. We show that this identity can be extended to generalized versions of entropy and Fisher information. More precisely, a generalized Fisher information naturally arises in the expression of the derivative of the Tsallis entropy. This generalized Fisher information also appears as a special case of a generalized Fisher information for estimation problems. Indeed, we derive here a new Cramér-Rao inequality for the estimation of a parameter, which involves a generalized form of Fisher information. This generalized Fisher information reduces to the standard Fisher information as a particular case. In the case of a translation parameter, the general Cramér-Rao inequality leads to an inequality for distributions which is saturated by generalized q-Gaussian distributions. These generalized q-Gaussians are important in several areas of physics and mathematics. They are known to maximize the q-entropies subject to a moment constraint. The Cramér-Rao inequality shows that the generalized q-Gaussians also minimize the generalized Fisher information among distributions with a fixed moment. Similarly, the generalized q-Gaussians also minimize the generalized Fisher information among distributions with a given q-entropy.
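    As a point of reference, the standard (q = 1) objects that this work generalizes are, in textbook form (the paper's generalized Fisher information itself is not reproduced here): the Tsallis q-entropy
    \[
    S_q[f] = \frac{1}{q-1}\left(1 - \int f(x)^{q}\,dx\right),
    \]
    which recovers the Boltzmann–Gibbs–Shannon entropy as q \to 1; the de Bruijn identity for a density smoothed by Gaussian noise,
    \[
    \frac{d}{dt}\,h\big(X + \sqrt{t}\,Z\big) = \tfrac{1}{2}\,I\big(X + \sqrt{t}\,Z\big), \qquad Z \sim \mathcal{N}(0,1)\ \text{independent of } X,
    \]
    with h the differential entropy and I(X) = \int \big(f'(x)\big)^{2}/f(x)\,dx the (location) Fisher information; and the translation-parameter Cramér-Rao inequality \operatorname{Var}(X)\,I(X) \ge 1, saturated by Gaussians, whose q-generalization in the paper is saturated by generalized q-Gaussians.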

    Lower Bounds on Exponential Moments of the Quadratic Error in Parameter Estimation

    Full text link
    Considering the problem of risk-sensitive parameter estimation, we propose a fairly wide family of lower bounds on the exponential moments of the quadratic error, in both the Bayesian and the non-Bayesian regimes. This family of bounds, which is based on a change of measures, offers considerable freedom in the choice of the reference measure, and our efforts are devoted to exploring this freedom to a certain extent. Our focus is mostly on signal models that are relevant to communication problems, namely, models of a parameter-dependent signal (modulated signal) corrupted by additive white Gaussian noise, but the proposed methodology is also applicable to other types of parametric families, such as models of linear systems driven by random input signals (white noise, in most cases), and others. In addition to the well-known motivations for the risk-sensitive cost function (i.e., the exponential quadratic cost function), most notably its robustness to model uncertainty, we also view this cost function as a tool for studying fundamental limits concerning the tail behavior of the estimation error. Another interesting aspect, which we demonstrate in a certain parametric model, is that the risk-sensitive cost function may be subject to phase transitions, owing to some analogies with statistical mechanics. Comment: 28 pages; 4 figures; submitted for publication.
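    For orientation, the quantity being bounded is the exponential moment of the quadratic error,
    \[
    \mathbb{E}_P\!\left[e^{s(\hat{\theta}-\theta)^{2}}\right], \qquad s > 0,
    \]
    and one generic change-of-measure ingredient of the kind such bounds build on (stated here only schematically; the paper's actual family of bounds is not reproduced) is the Donsker–Varadhan inequality: for any reference measure Q with Q \ll P,
    \[
    \log \mathbb{E}_P\!\left[e^{f(X)}\right] \;\ge\; \mathbb{E}_Q\big[f(X)\big] - D(Q\,\|\,P),
    \]
    with f(X) = s\big(\hat{\theta}(X)-\theta\big)^{2} and D(Q\|P) the Kullback–Leibler divergence; the freedom in choosing Q mirrors the freedom in the reference measure mentioned in the abstract.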

    Performance Bounds for Parameter Estimation under Misspecified Models: Fundamental findings and applications

    Full text link
    Inferring information from a set of acquired data is the main objective of any signal processing (SP) method. In particular, the common problem of estimating the value of a vector of parameters from a set of noisy measurements has been at the core of a plethora of scientific and technological advances in recent decades, for example in wireless communications, radar and sonar, biomedicine, image processing, and seismology, just to name a few. Developing an estimation algorithm often begins by assuming a statistical model for the measured data, i.e., a probability density function (pdf) which, if correct, fully characterizes the behaviour of the collected data/measurements. Experience with real data, however, often exposes the limitations of any assumed data model, since modelling errors at some level are always present. Consequently, the true data model and the model assumed to derive the estimation algorithm could differ. When this happens, the model is said to be mismatched or misspecified. Therefore, understanding the possible performance loss, or regret, that an estimation algorithm could experience under model misspecification is of crucial importance for any SP practitioner. Further, understanding the limits on the performance of any estimator subject to model misspecification is of practical interest. Motivated by the widespread and practical need to assess the performance of a mismatched estimator, the first goal of this paper is to bring attention to the main theoretical findings on estimation theory, and in particular on lower bounds under model misspecification, that have been published in the statistical and econometric literature in the last fifty years. Secondly, some applications are discussed to illustrate the broad range of areas and problems to which this framework extends, and consequently the numerous opportunities available for SP researchers. Comment: To appear in the IEEE Signal Processing Magazine.
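    For readers new to this literature, the bound that typically appears in it is the misspecified Cramér-Rao bound (MCRB), sketched here in its commonly quoted sandwich form (the paper should be consulted for the precise regularity conditions and definitions). With p the true pdf and \{f_\theta\} the assumed family, the pseudo-true parameter is
    \[
    \theta_0 = \arg\min_{\theta} D\big(p\,\|\,f_\theta\big),
    \]
    and, for estimators that are unbiased with respect to \theta_0 under p, the error covariance satisfies
    \[
    \operatorname{Cov}_p\big(\hat{\theta}\big) \;\succeq\; A(\theta_0)^{-1} B(\theta_0) A(\theta_0)^{-1},
    \qquad
    A_{ij} = \mathbb{E}_p\!\left[\frac{\partial^{2}\log f_{\theta}(X)}{\partial\theta_i\,\partial\theta_j}\right]_{\theta_0},
    \quad
    B_{ij} = \mathbb{E}_p\!\left[\frac{\partial\log f_{\theta}(X)}{\partial\theta_i}\,\frac{\partial\log f_{\theta}(X)}{\partial\theta_j}\right]_{\theta_0},
    \]
    which collapses to the classical CRB when p = f_{\theta_0}, since then -A = B equals the Fisher information matrix.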

    Tighter quantum uncertainty relations follow from a general probabilistic bound

    Get PDF
    Uncertainty relations (URs) like the Heisenberg-Robertson or the time-energy UR are often considered to be hallmarks of quantum theory. Here, a simple derivation of these URs is presented based on a single classical inequality from estimation theory, a Cramér-Rao-like bound. The Heisenberg-Robertson UR is then obtained by using the Born rule and the Schrödinger equation. This allows a clear separation of the probabilistic nature of quantum mechanics from the Hilbert space structure and the dynamical law. It also simplifies the interpretation of the bound. In addition, the Heisenberg-Robertson UR is tightened for mixed states by replacing one variance with the so-called quantum Fisher information. Thermal states of Hamiltonians with evenly gapped energy levels are shown to saturate the tighter bound for natural choices of the operators. This example is further extended to Gaussian states of a harmonic oscillator. For many-qubit systems, we illustrate the interplay between entanglement and the structure of the operators that saturate the UR with spin-squeezed states and Dicke states. Comment: 8 pages, 1 figure. v2: improved presentation, references added, results on the connection between the saturated inequality and the entanglement structure of multi-qubit states added.
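    Schematically, the relations involved are the Robertson form of the Heisenberg uncertainty relation,
    \[
    \Delta A\,\Delta B \;\ge\; \tfrac{1}{2}\,\big|\langle[A,B]\rangle\big|,
    \]
    together with the standard fact that the quantum Fisher information obeys F_Q[\rho, B] \le 4\,(\Delta B)^{2}, with equality for pure states; the tightening described in the abstract replaces one variance by the quantum Fisher information (up to this conventional factor of 4), giving, schematically,
    \[
    (\Delta A)^{2}\, F_Q[\rho, B] \;\ge\; \big|\langle[A,B]\rangle\big|^{2},
    \]
    which reduces to the Robertson bound for pure states and is strictly stronger for mixed states whenever F_Q[\rho,B] < 4(\Delta B)^{2}.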

    Analysis of the Bayesian Cramer-Rao lower bound in astrometry: Studying the impact of prior information in the location of an object

    Full text link
    Context. The best precision that can be achieved to estimate the location of a stellar-like object is a topic of permanent interest in the astrometric community. Aims. We analyse bounds for the best position estimation of a stellar-like object on a CCD detector array in a Bayesian setting, where the position is unknown but we have access to a prior distribution. In contrast to a parametric setting, where we estimate a parameter from observations, the Bayesian approach estimates a random object (i.e., the position is a random variable) from observations that are statistically dependent on the position. Methods. We characterize the Bayesian Cramer-Rao (CR) bound on the minimum mean square error (MMSE) of the best estimator of the position of a point source on a linear CCD-like detector, as a function of the properties of the detector, the source, and the background. Results. We quantify and analyse the increase in astrometric performance from the use of a prior distribution of the object position, which is not available in the classical parametric setting. This gain is shown to be significant for various observational regimes, in particular in the case of faint objects or when the observations are taken under poor conditions. Furthermore, we present numerical evidence that the MMSE estimator of this problem tightly achieves the Bayesian CR bound. This is a remarkable result, demonstrating that all the performance gains presented in our analysis can be achieved with the MMSE estimator. Conclusions. The Bayesian CR bound can be used as a benchmark indicator of the expected maximum positional precision of a set of astrometric measurements in which prior information can be incorporated. This bound can be achieved through the conditional mean estimator, in contrast to the parametric case, where no unbiased estimator precisely reaches the CR bound. Comment: 17 pages, 12 figures. Accepted for publication in Astronomy & Astrophysics.
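    In the scalar case, the Bayesian CR bound used here can be written in its Van Trees form (a schematic statement; the paper expresses it in terms of the detector, source, and background model): for a position \theta with prior density \pi and observations with Fisher information I(\theta),
    \[
    \mathrm{MMSE} \;=\; \mathbb{E}\big[(\hat{\theta}(X)-\theta)^{2}\big] \;\ge\; \frac{1}{\mathbb{E}_{\pi}\big[I(\theta)\big] + J(\pi)},
    \qquad
    J(\pi) = \int \frac{\big(\pi'(\theta)\big)^{2}}{\pi(\theta)}\,d\theta,
    \]
    where the prior contributes the additional information term J(\pi) that is absent from the parametric CR bound; the paper's numerical evidence indicates that the conditional-mean (MMSE) estimator \hat{\theta}(x) = \mathbb{E}[\theta \mid X = x] tightly attains this bound in the regimes studied.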