90 research outputs found

    Computable lower bounds for deterministic parameter estimation

    Get PDF
    This paper is primarily tutorial in nature and presents a simple approach (norm minimization under linear constraints) for deriving computable lower bounds on the MSE of deterministic parameter estimators, with a clear interpretation of the bounds. We also address the tightness of these lower bounds in comparison with the MSE of maximum likelihood (ML) estimators and their ability to predict the SNR threshold region. Last, as many practical estimation problems must be regarded as joint detection-estimation problems, we stress that estimation performance must be conditional on detection performance, which leads to the open problem of the fundamental limits of joint detection-estimation performance.
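    A minimal sketch (not taken from the paper) of the kind of computable bound this family produces: the McAulay-Seidman finite-test-point form of the Barankin bound, applied to the toy problem of estimating the mean of i.i.d. Gaussian samples, where the likelihood-ratio correlation matrix has a closed form. With a single test point it reduces to the Chapman-Robbins bound; as the test points shrink it recovers the Cramer-Rao bound.

```python
import numpy as np

def barankin_bound(h, n, sigma2):
    """McAulay-Seidman (finite-test-point Barankin) bound on the MSE of any
    unbiased estimator of the mean of n i.i.d. N(theta, sigma2) samples.
    h: array of test-point offsets theta_i - theta_0."""
    h = np.asarray(h, dtype=float)
    # For this model E[nu_i nu_j] = exp(n * h_i * h_j / sigma2) in closed form,
    # where nu_i is the likelihood ratio at test point theta_0 + h_i
    B = np.exp(n * np.outer(h, h) / sigma2)
    C = B - 1.0                     # covariance of the centred likelihood ratios
    return h @ np.linalg.solve(C, h)

n, sigma2 = 10, 1.0
crb = sigma2 / n                                 # Cramer-Rao bound
cr = barankin_bound([0.5], n, sigma2)            # Chapman-Robbins: one test point
bb = barankin_bound([0.01, -0.01], n, sigma2)    # small offsets recover the CRB
```

    In this regular Gaussian model the sample mean is efficient, so every finite-test-point bound sits at or below the CRB; the value of the construction shows up in nonlinear problems, where well-placed test points push the bound above the CRB in the threshold region.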

    On Lower Bounds for Non Standard Deterministic Estimation

    Get PDF
    We consider deterministic parameter estimation in the situation where the probability density function (p.d.f.) parameterized by unknown deterministic parameters results from the marginalization of a joint p.d.f. depending on random variables as well. Unfortunately, in the general case this marginalization is mathematically intractable, which prevents the use of the known standard deterministic lower bounds (LBs) on the mean squared error (MSE). The general case can nevertheless be tackled by embedding the initial observation space in a hybrid one, where any standard LB can be transformed into a modified one fitted to nonstandard deterministic estimation, at the expense of tightness, however. Furthermore, these modified LBs (MLBs) appear to include the submatrix of hybrid LBs that is an LB for the deterministic parameters. Moreover, since in nonstandard estimation maximum likelihood estimators (MLEs) can no longer be derived, suboptimal nonstandard MLEs (NSMLEs) are proposed as a substitute. We show that any standard LB on the MSE of MLEs has a nonstandard version lower bounding the MSE of NSMLEs. We provide an analysis of the relative performance of the NSMLEs, as well as a comparison with the MLBs, for a large class of estimation problems. Last, the general approach introduced is exemplified, among other things, with a new look at the well-known Gaussian complex observation models.

    Hybrid Lower Bound On The MSE Based On The Barankin And Weiss-Weinstein Bounds

    No full text
    This article investigates hybrid lower bounds in order to predict the estimators' mean square error threshold effect. A tractable and computationally efficient form is derived that combines the Barankin and Weiss-Weinstein bounds. The bound is applied to a frequency estimation problem for which a closed-form expression is provided. A comparison with results on the hybrid Barankin bound shows the superiority of the new bound in predicting the mean square error threshold.

    Statistics of the MLE and Approximate Upper and Lower Bounds - Part 1: Application to TOA Estimation

    Get PDF
    In nonlinear deterministic parameter estimation, the maximum likelihood estimator (MLE) is unable to attain the Cramer-Rao lower bound at low and medium signal-to-noise ratios (SNR) due to the threshold and ambiguity phenomena. In order to evaluate the achieved mean-squared error (MSE) at those SNR levels, we propose new MSE approximations (MSEA) and an approximate upper bound by using the method of interval estimation (MIE). The mean and the distribution of the MLE are approximated as well. The MIE consists in splitting the a priori domain of the unknown parameter into intervals and computing the statistics of the estimator in each interval. Also, we derive an approximate lower bound (ALB) based on the Taylor series expansion of noise, and an ALB family by employing the binary detection principle. The accuracy of the proposed MSEAs and the tightness of the derived approximate bounds are validated by considering the example of time-of-arrival estimation.
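    The threshold effect described above can be reproduced with a short Monte Carlo sketch. This is an illustrative stand-in, not the paper's TOA setup: single-tone frequency estimation from complex Gaussian noise, where the MLE is the periodogram maximizer and its MSE is compared with the CRB above and below threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

def crb_freq(snr, n):
    # CRB on the frequency (cycles/sample) of a single complex exponential
    return 6.0 / ((2 * np.pi) ** 2 * snr * n * (n ** 2 - 1))

def mle_mse(snr, n=32, f0=0.2, trials=200, nfft=1 << 15):
    """Empirical MSE of the grid-search MLE (periodogram maximizer)."""
    k = np.arange(n)
    s = np.exp(2j * np.pi * f0 * k)              # unit-amplitude tone
    sigma = np.sqrt(1.0 / snr)
    errs = np.empty(trials)
    for t in range(trials):
        w = sigma * np.sqrt(0.5) * (rng.standard_normal(n)
                                    + 1j * rng.standard_normal(n))
        fhat = np.argmax(np.abs(np.fft.fft(s + w, nfft))) / nfft
        errs[t] = (fhat - f0 + 0.5) % 1.0 - 0.5  # wrapped frequency error
    return np.mean(errs ** 2)

n = 32
snr_hi, snr_lo = 100.0, 10 ** (-1.5)             # +20 dB and -15 dB per sample
ratio_hi = mle_mse(snr_hi, n) / crb_freq(snr_hi, n)  # near 1: CRB is tight
ratio_lo = mle_mse(snr_lo, n) / crb_freq(snr_lo, n)  # >> 1: threshold region
```

    Above threshold the MSE hugs the CRB; below it, outlier estimates spread over the whole frequency band and the MSE departs from the CRB by orders of magnitude, which is exactly the regime the MIE-style approximations target.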

    Hierarchies of Frequentist Bounds for Quantum Metrology: From Cramér-Rao to Barankin

    Full text link
    We derive lower bounds on the variance of estimators in quantum metrology by choosing test observables that define constraints on the unbiasedness of the estimator. The quantum bounds are obtained by analytical optimization over all possible quantum measurements and estimators that satisfy the given constraints. We obtain hierarchies of increasingly tight bounds that include the quantum Cramér-Rao bound at the lowest order. In the opposite limit, the quantum Barankin bound gives the variance of the locally best unbiased estimator in quantum metrology. Our results reveal generalizations of the quantum Fisher information that are able to avoid regularity conditions and identify threshold behavior in quantum measurements with mixed states, caused by finite data.

    A Useful Form of the Abel Bound and Its Application to Estimator Threshold Prediction

    No full text
    This correspondence investigates the Abel bound in order to predict the estimators' mean square error (MSE) threshold effect. A tractable and computationally efficient form of this bound is derived that combines the Chapman–Robbins and Cramér–Rao bounds. The bound is applied to a data-aided carrier frequency estimation problem for which a closed-form expression is provided. An indicator of the signal-to-noise ratio threshold is proposed. A comparison with recent results on the Barankin bound (Chapman–Robbins version) shows the superiority of the Abel-bound version in predicting the MSE threshold without increasing the computational complexity.
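    One ingredient of the Abel bound, the Chapman–Robbins bound, is worth a small worked example (not from the correspondence): unlike the Cramér–Rao bound, it needs no regularity conditions, so it applies to x ~ U(0, theta), whose support depends on the parameter. The sketch below evaluates the bound numerically over a grid of test points; the analytic supremum is theta^2/4.

```python
import numpy as np

theta = 1.0   # x ~ U(0, theta); the CRB does not apply (parameter-dependent support)

def chi2_div(p_alt, p_base, lo, hi, m=100000):
    # midpoint-rule chi-square divergence: integral of p_alt^2 / p_base, minus 1
    dx = (hi - lo) / m
    xs = lo + (np.arange(m) + 0.5) * dx
    return np.sum(p_alt(xs) ** 2 / p_base(xs)) * dx - 1.0

def cr_bound(h):
    # Chapman-Robbins bound at test point theta + h; we need -theta < h < 0 so
    # the shifted support stays inside the reference support
    p_base = lambda x: np.full_like(x, 1.0 / theta)
    p_alt = lambda x: np.full_like(x, 1.0 / (theta + h))
    return h * h / chi2_div(p_alt, p_base, 0.0, theta + h)

bound = max(cr_bound(h) for h in np.linspace(-0.95, -0.05, 91))
# supremum theta**2 / 4 is attained at h = -theta / 2; for comparison, the
# unbiased estimator 2 * x has variance theta**2 / 3 >= theta**2 / 4
```

    The grid search mirrors how Barankin-type bounds are used in practice: the test-point location, not just the local curvature of the likelihood, controls how tight the bound is.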

    A Constrained Hybrid Cramér-Rao Bound for Parameter Estimation

    Get PDF
    In statistical signal processing, hybrid parameter estimation refers to the case where the parameter vector to be estimated contains both non-random and random parameters. Numerous works have shown the versatility of the deterministic constrained Cramér-Rao bound for estimation performance analysis and for the design of measurement systems. In many systems, however, both random and non-random parameters may occur simultaneously. In this communication, we propose a constrained hybrid lower bound which takes into account equality constraints on the deterministic parameters. The usefulness of the proposed bound is illustrated with an application to radar Doppler estimation.
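    The deterministic constrained CRB mentioned above has a compact projector form (the Stoica-Ng construction): with constraint gradient G and U an orthonormal basis of its null space, the constrained bound is U (U' F U)^{-1} U'. A minimal sketch, for a toy model of two independent Gaussian means tied by theta1 + theta2 = c (the hybrid bound of the paper is not reproduced here):

```python
import numpy as np

# Fisher information for two independent Gaussian means, n samples each, unit variance
n = 10
F = n * np.eye(2)
crb_u = np.linalg.inv(F)                        # unconstrained CRB: diag = 1/n

# Linear equality constraint theta1 + theta2 = c has gradient G = [1, 1]
G = np.array([[1.0, 1.0]])
_, _, vh = np.linalg.svd(G)
U = vh[G.shape[0]:].T                           # orthonormal basis of null(G)

# Constrained CRB (Stoica-Ng): project the Fisher information onto the constraint set
crb_c = U @ np.linalg.inv(U.T @ F @ U) @ U.T
```

    The constraint removes one degree of freedom, so the per-parameter variance bound drops from 1/n to 1/(2n): knowing the sum exactly lets each observation inform both parameters.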

    A Fresh Look at the Bayesian Bounds of the Weiss-Weinstein Family

    No full text
    Minimal bounds on the mean square error (MSE) are generally used to predict the best achievable performance of an estimator for a given observation model. In this paper, we are interested in the Bayesian bounds of the Weiss–Weinstein family. This family includes the Bayesian Cramér–Rao bound, the Bobrovsky–Mayer-Wolf–Zakaï bound, the Bayesian Bhattacharyya bound, the Bobrovsky–Zakaï bound, the Reuven–Messer bound, and the Weiss–Weinstein bound. We present a unification of all these minimal bounds based on a rewriting of the minimum mean square error estimator (MMSEE) and on a constrained optimization problem. With this approach, we obtain a useful theoretical framework for deriving new Bayesian bounds. For that purpose, we propose two bounds. First, we propose a generalization of the Bayesian Bhattacharyya bound, extending the works of Bobrovsky, Mayer-Wolf, and Zakaï. Second, we propose a bound based on both the Bayesian Bhattacharyya bound and the Reuven–Messer bound, representing a generalization of these bounds. The proposed bound is the Bayesian extension of the deterministic Abel bound and is found to be tighter than the Bayesian Bhattacharyya bound, the Reuven–Messer bound, the Bobrovsky–Zakaï bound, and the Bayesian Cramér–Rao bound. We propose closed-form expressions of these bounds for a general Gaussian observation model with parameterized mean. In order to illustrate our results, we present simulation results in the context of a spectral analysis problem.
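    The simplest member of this family, the Bayesian Cramér–Rao bound, can be checked in a toy linear-Gaussian model (not the paper's spectral-analysis example), where the MMSE estimator is the posterior mean and attains the bound exactly:

```python
import numpy as np

rng = np.random.default_rng(1)
n, sig2, sig2_p, trials = 5, 1.0, 2.0, 20000

# theta ~ N(0, sig2_p); x_i = theta + w_i with w_i ~ N(0, sig2), i = 1..n.
# For this model the Bayesian CRB is 1 / (n/sig2 + 1/sig2_p)
bcrb = 1.0 / (n / sig2 + 1.0 / sig2_p)

theta = rng.normal(0.0, np.sqrt(sig2_p), trials)              # prior draws
x = theta[:, None] + rng.normal(0.0, np.sqrt(sig2), (trials, n))
post_mean = (n / sig2) * x.mean(axis=1) * bcrb                # MMSE estimator
mse = np.mean((post_mean - theta) ** 2)                       # should match bcrb
```

    In this linear-Gaussian case the Bayesian CRB is tight; the tighter bounds of the Weiss–Weinstein family matter precisely in the nonlinear models where the posterior mean no longer has this simple form and the BCRB becomes optimistic.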

    Performance Bounds for Parameter Estimation under Misspecified Models: Fundamental findings and applications

    Full text link
    Inferring information from a set of acquired data is the main objective of any signal processing (SP) method. In particular, the common problem of estimating the value of a vector of parameters from a set of noisy measurements is at the core of a plethora of scientific and technological advances of recent decades, for example in wireless communications, radar and sonar, biomedicine, image processing, and seismology, just to name a few. Developing an estimation algorithm often begins by assuming a statistical model for the measured data, i.e., a probability density function (pdf) that, if correct, fully characterizes the behaviour of the collected data/measurements. Experience with real data, however, often exposes the limitations of any assumed data model, since modelling errors at some level are always present. Consequently, the true data model and the model assumed to derive the estimation algorithm may differ. When this happens, the model is said to be mismatched or misspecified. Therefore, understanding the possible performance loss, or regret, that an estimation algorithm could experience under model misspecification is of crucial importance for any SP practitioner. Further, understanding the limits on the performance of any estimator subject to model misspecification is of practical interest. Motivated by the widespread and practical need to assess the performance of a mismatched estimator, the first goal of this paper is to help bring attention to the main theoretical findings on estimation theory, and in particular on lower bounds under model misspecification, that have been published in the statistical and econometric literature in the last fifty years. Second, some applications are discussed to illustrate the broad range of areas and problems to which this framework extends, and consequently the numerous opportunities available for SP researchers. To appear in the IEEE Signal Processing Magazine.
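    A standard result in this literature is that under misspecification the MLE no longer converges to a "true" parameter but to the pseudo-true parameter, the point in the assumed family closest in Kullback-Leibler divergence to the actual data distribution; misspecified bounds are defined around that point. An illustrative sketch (not from the paper): fitting a Gaussian to Laplace-distributed data, where the KL projection simply matches the first two moments.

```python
import numpy as np

rng = np.random.default_rng(2)
b = 1.0                                  # Laplace scale; true variance is 2 * b**2
x = rng.laplace(0.0, b, 200000)          # data from the *actual* (Laplace) model

# Gaussian MLE fitted to the non-Gaussian data
mu_hat, var_hat = x.mean(), x.var()

# KL-projecting the Laplace pdf onto the Gaussian family matches its moments,
# so the pseudo-true parameters are mu0 = 0 and var0 = 2 * b**2, and the
# mismatched MLE converges to them as the sample grows
```

    The fitted mean and variance land on the pseudo-true values even though no Gaussian generated the data; misspecified lower bounds (e.g., the MCRB) then bound the spread of the mismatched estimator around exactly this point.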