    Hybrid Lower Bound On The MSE Based On The Barankin And Weiss-Weinstein Bounds

    This article investigates hybrid lower bounds in order to predict the threshold effect in the estimator's mean square error. A tractable and computationally efficient form, combining the Barankin and Weiss-Weinstein bounds, is derived. This bound is applied to a frequency estimation problem, for which a closed-form expression is provided. A comparison with results on the hybrid Barankin bound shows that the new bound predicts the mean square error threshold more accurately.
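
    As a point of reference (recalled here from the general literature, not taken from the article), the Weiss-Weinstein ingredient is often used in its single-test-point form with s = 1/2: for a scalar parameter \theta with joint density p(x,\theta) and test point h,

    \mathrm{MSE} \;\geq\; \sup_{h}\; \frac{h^{2}\,\eta^{2}(h)}{2\left(1-\eta(2h)\right)},
    \qquad
    \eta(h) \;=\; \int\!\!\int \sqrt{p(x,\theta+h)\,p(x,\theta)}\;dx\,d\theta .

    The hybrid bound studied in the article combines terms of this type with Barankin-type test points over the deterministic parameters; the exact combination is the one given in the paper.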

    A Fresh Look at the Bayesian Bounds of the Weiss-Weinstein Family

    Minimal bounds on the mean square error (MSE) are generally used to predict the best achievable performance of an estimator for a given observation model. In this paper, we are interested in the Bayesian bounds of the Weiss-Weinstein family. This family includes the Bayesian Cramér-Rao bound, the Bobrovsky–Mayer-Wolf–Zakaï bound, the Bayesian Bhattacharyya bound, the Bobrovsky–Zakaï bound, the Reuven–Messer bound, and the Weiss–Weinstein bound. We present a unification of all these minimal bounds based on a rewriting of the minimum mean square error estimator (MMSEE) and on a constrained optimization problem. This approach yields a useful theoretical framework for deriving new Bayesian bounds, and we propose two of them. First, we propose a generalization of the Bayesian Bhattacharyya bound extending the work of Bobrovsky, Mayer-Wolf, and Zakaï. Second, we propose a bound based on both the Bayesian Bhattacharyya bound and the Reuven–Messer bound, which generalizes them. This proposed bound is the Bayesian extension of the deterministic Abel bound and is found to be tighter than the Bayesian Bhattacharyya bound, the Reuven–Messer bound, the Bobrovsky–Zakaï bound, and the Bayesian Cramér–Rao bound. We provide closed-form expressions of these bounds for a general Gaussian observation model with parameterized mean. To illustrate our results, we present simulation results in the context of a spectral analysis problem.
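
    As an informal reminder of the mechanism common to this family (a sketch, not the paper's exact formulation): for any estimator \hat{\theta}(x) and any function \psi(x,\theta) satisfying \int \psi(x,\theta)\,p(\theta \mid x)\,d\theta = 0 for all x, the Cauchy-Schwarz inequality gives

    \mathbb{E}\!\left[\left(\hat{\theta}(x)-\theta\right)^{2}\right] \;\geq\; \frac{\mathbb{E}\!\left[\theta\,\psi(x,\theta)\right]^{2}}{\mathbb{E}\!\left[\psi^{2}(x,\theta)\right]} .

    Different choices of \psi (score functions, likelihood-ratio differences, higher-order derivatives) recover the Bayesian Cramér-Rao, Bobrovsky-Zakaï, Bhattacharyya and Weiss-Weinstein bounds; the constrained-optimization viewpoint developed in the paper makes this choice systematic.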

    Statistics of the MLE and Approximate Upper and Lower Bounds - Part 1: Application to TOA Estimation

    In nonlinear deterministic parameter estimation, the maximum likelihood estimator (MLE) is unable to attain the Cramér-Rao lower bound at low and medium signal-to-noise ratios (SNR) due to the threshold and ambiguity phenomena. In order to evaluate the achieved mean squared error (MSE) at those SNR levels, we propose new MSE approximations (MSEA) and an approximate upper bound obtained by the method of interval estimation (MIE). The mean and the distribution of the MLE are approximated as well. The MIE consists of splitting the a priori domain of the unknown parameter into intervals and computing the statistics of the estimator in each interval. We also derive an approximate lower bound (ALB) based on a Taylor series expansion of the noise, as well as a family of ALBs obtained by employing the binary detection principle. The accuracy of the proposed MSEAs and the tightness of the derived approximate bounds are validated on the example of time-of-arrival estimation.
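
    As an illustrative sketch (not the paper's code), an MIE-style MSE approximation weights the conditional MSE in each interval of the a priori domain by the probability that the estimate falls in that interval; the probabilities and conditional MSEs used below are placeholders that would come from the actual observation model.

    import numpy as np

    def mie_mse_approximation(interval_probs, interval_mses):
        """Method-of-interval-estimation style MSE approximation.

        interval_probs : probability that the MLE falls in each interval
                         of the a priori domain (should sum to ~1).
        interval_mses  : conditional MSE of the estimator given that it
                         falls in the corresponding interval.
        Returns the probability-weighted sum of the conditional MSEs.
        """
        p = np.asarray(interval_probs, dtype=float)
        m = np.asarray(interval_mses, dtype=float)
        return float(np.sum(p * m))

    # Toy usage: one "correct" interval where the estimator is locally
    # accurate, plus two ambiguous side intervals causing large outliers.
    probs = [0.90, 0.05, 0.05]   # hypothetical interval probabilities
    mses = [1e-3, 4.0, 4.0]      # hypothetical conditional MSEs
    print(mie_mse_approximation(probs, mses))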

    Some results on the Weiss-Weinstein bound for conditional and unconditional signal models in array processing

    In this paper, the Weiss-Weinstein bound is analyzed in the context of source localization with a planar array of sensors. Both conditional and unconditional source signal models are studied. First, some results are given in the multiple-source context without specifying the structure of the steering matrix or of the noise covariance matrix. Moreover, the cases of a uniform and of a Gaussian prior are analyzed. Second, these results are applied to the particular case of a single source for two kinds of array geometries: a non-uniform linear array (elevation only) and an arbitrary planar (azimuth and elevation) array.
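
    For context (standard array-processing notation, not specific to this paper), the two signal models usually referred to as conditional and unconditional are, for snapshots t = 1, ..., T,

    \mathbf{x}(t) = \mathbf{A}(\boldsymbol{\theta})\,\mathbf{s}(t) + \mathbf{n}(t), \qquad \mathbf{n}(t) \sim \mathcal{CN}\!\left(\mathbf{0}, \sigma^{2}\mathbf{I}\right),

    where \mathbf{A}(\boldsymbol{\theta}) is the steering matrix parameterized by the source locations; the conditional model treats the source signals \mathbf{s}(t) as deterministic unknowns, while the unconditional model treats them as i.i.d. zero-mean circular complex Gaussian vectors with covariance \mathbf{R}_{\mathbf{s}}.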

    The Bayesian Abel Bound on the Mean Square Error

    This paper deals with lower bounds on the mean square error (MSE). In the Bayesian framework, we present a new bound derived from a constrained optimization problem. This bound is found to be tighter than the Bayesian Bhattacharyya bound, the Reuven-Messer bound, the Bobrovsky-Zakai bound, and the Bayesian Cramér-Rao bound.

    On Lower Bounds for Nonstandard Deterministic Estimation

    We consider deterministic parameter estimation in the situation where the probability density function (p.d.f.) parameterized by unknown deterministic parameters results from the marginalization of a joint p.d.f. that also depends on random variables. Unfortunately, in the general case this marginalization is mathematically intractable, which prevents the use of the known standard deterministic lower bounds (LBs) on the mean squared error (MSE). The general case can, however, be tackled by embedding the initial observation space in a hybrid one, in which any standard LB can be transformed into a modified LB fitted to nonstandard deterministic estimation, at the expense of tightness. Furthermore, these modified LBs (MLBs) turn out to include the submatrices of hybrid LBs that lower bound the MSE of the deterministic parameters. Moreover, since maximum likelihood estimators (MLEs) can no longer be derived in nonstandard estimation, suboptimal nonstandard MLEs (NSMLEs) are proposed as a substitute. We show that any standard LB on the MSE of MLEs has a nonstandard version that lower bounds the MSE of NSMLEs. We provide an analysis of the relative performance of the NSMLEs, as well as a comparison with the MLBs, for a large class of estimation problems. Last, the general approach is exemplified with, among other things, a new look at the well-known complex Gaussian observation models.
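
    A familiar special case of this idea, stated here only as background and in notation that may differ from the paper's: when the marginal p(x;\theta) = \int p(x,\theta_r;\theta)\,d\theta_r is intractable, the modified Cramér-Rao bound works with the joint density instead,

    \mathrm{MCRB}(\theta) \;=\; \left( \mathbb{E}_{x,\theta_r}\!\left[ -\frac{\partial^{2} \ln p(x \mid \theta_r;\theta)}{\partial \theta^{2}} \right] \right)^{-1} \;\leq\; \mathrm{CRB}(\theta),

    and is in general looser than the standard Cramér-Rao bound, consistent with the loss of tightness mentioned above.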

    A New Derivation of the Bayesian Bounds for Parameter Estimation

    This paper deals with minimal bounds in the Bayesian context. We express the minimum mean square error of the conditional mean estimator as the solution of a continuum constrained optimization problem. By relaxing these constraints, we obtain the bounds of the Weiss-Weinstein family. Moreover, this method enables us to derive new bounds, such as the Bayesian version of the deterministic Abel bound.

    A Constrained Hybrid Cramér-Rao Bound for Parameter Estimation

    In statistical signal processing, hybrid parameter estimation refers to the case where the parameter vector to be estimated contains both non-random and random parameters. Numerous works have shown the versatility of the deterministic constrained Cramér-Rao bound for estimation performance analysis and for the design of measurement systems. However, in many systems both random and non-random parameters may occur simultaneously. In this communication, we propose a constrained hybrid lower bound which takes into account equality constraints on the deterministic parameters. The usefulness of the proposed bound is illustrated with an application to radar Doppler estimation.
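
    As background, here is the deterministic building block (not the hybrid bound proposed in the communication): for differentiable equality constraints \mathbf{f}(\boldsymbol{\theta}) = \mathbf{0} with Jacobian \mathbf{F}(\boldsymbol{\theta}) = \partial \mathbf{f}/\partial \boldsymbol{\theta}^{T}, and \mathbf{U}(\boldsymbol{\theta}) a basis of the null space of \mathbf{F} (so that \mathbf{F}\mathbf{U} = \mathbf{0}), the constrained Cramér-Rao bound replaces the inverse Fisher information \mathbf{J}^{-1} by

    \mathrm{CCRB}(\boldsymbol{\theta}) \;=\; \mathbf{U}\left(\mathbf{U}^{T}\mathbf{J}(\boldsymbol{\theta})\,\mathbf{U}\right)^{-1}\mathbf{U}^{T} \;\preceq\; \mathbf{J}(\boldsymbol{\theta})^{-1}.

    The hybrid bound proposed in the communication extends this construction to a parameter vector that also contains random components.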

    Caractérisation des performances minimales d'estimation pour des modèles d'observations non-standards

    In the parametric estimation context, estimator performance can be characterized, inter alia, by the mean square error (MSE) and the resolution limit. The former quantifies the accuracy of the estimated values and the latter defines the ability of the estimator to correctly resolve several parameters. This thesis deals first with the prediction of the "optimal" MSE by using lower bounds in the hybrid estimation context (i.e., when the parameter vector contains both random and non-random parameters), second with the extension of Cramér-Rao bounds to non-standard estimation problems, and finally with the characterization of estimator resolution. The manuscript is divided into three parts. First, we fill some gaps in the literature on hybrid lower bounds on the MSE by using two existing Bayesian lower bounds: the Weiss-Weinstein bound and a particular form of the Ziv-Zakai family of lower bounds. We show that these extended bounds are tighter than the existing hybrid lower bounds for predicting the optimal MSE. Second, we extend Cramér-Rao lower bounds to less common estimation contexts, namely: (i) when the non-random parameters are subject to equality constraints (linear or nonlinear); (ii) for discrete-time filtering problems in which the evolution of the states is governed by a Markov chain; and (iii) when the assumed observation model differs from the true data distribution. Finally, we study the resolution and accuracy of estimators, when their probability distributions are known, by proposing a criterion based directly on the distribution of the estimates. This approach extends the work of Oh and Kashyap and the work of Clark to multidimensional parameter estimation problems.
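
    As background for item (ii), and only as a standard result stated in notation that may differ from the manuscript's: for a discrete-time Markovian state evolution with transition density p(x_{k+1} \mid x_k) and measurement likelihood p(z_{k+1} \mid x_{k+1}), the posterior (Bayesian) Cramér-Rao bound on the filtering MSE can be propagated through the information-matrix recursion of Tichavský, Muravchik and Nehorai,

    \mathbf{J}_{k+1} = \mathbf{D}_{k}^{22} - \mathbf{D}_{k}^{21}\left(\mathbf{J}_{k} + \mathbf{D}_{k}^{11}\right)^{-1}\mathbf{D}_{k}^{12},
    \qquad \mathrm{MSE}(\hat{\mathbf{x}}_{k}) \succeq \mathbf{J}_{k}^{-1},

    where \mathbf{D}_{k}^{11}, \mathbf{D}_{k}^{12} = (\mathbf{D}_{k}^{21})^{T} and \mathbf{D}_{k}^{22} are expectations of negative second derivatives of \ln p(x_{k+1} \mid x_k) and \ln p(z_{k+1} \mid x_{k+1}) with respect to the appropriate state blocks.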