64 research outputs found

    A NEW DERIVATION OF THE BAYESIAN BOUNDS FOR PARAMETER ESTIMATION

    No full text
    International audience
    This paper deals with minimal bounds in the Bayesian context. We express the minimum mean square error of the conditional mean estimator as the solution of an optimization problem under a continuum of constraints. By relaxing these constraints, we obtain the bounds of the Weiss-Weinstein family. Moreover, this method enables us to derive new bounds, such as the Bayesian version of the deterministic Abel bound.
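The closed-form structure invoked above can be checked numerically in the simplest setting. The sketch below is purely illustrative (the linear-Gaussian model, variances, and variable names are assumptions, not taken from the paper): in this model the conditional-mean estimator and its MMSE are available in closed form, and the Bayesian Cramér-Rao bound, hence every bound of the Weiss-Weinstein family, is tight.

```python
import numpy as np

rng = np.random.default_rng(0)

sig_p2 = 1.0      # prior variance of theta       (assumption)
sig_n2 = 0.5      # observation noise variance    (assumption)
T = 4             # observations per realization
trials = 200_000

# Closed-form MMSE of the conditional-mean estimator:
# posterior precision = prior precision + data precision.
mmse = 1.0 / (1.0 / sig_p2 + T / sig_n2)

# Monte Carlo check of that closed form.
theta = rng.normal(0.0, np.sqrt(sig_p2), trials)
x = theta[:, None] + rng.normal(0.0, np.sqrt(sig_n2), (trials, T))
gain = (T / sig_n2) / (T / sig_n2 + 1.0 / sig_p2)
theta_hat = gain * x.mean(axis=1)     # posterior mean: shrunk sample mean
mc_mse = np.mean((theta_hat - theta) ** 2)

# In this linear-Gaussian model the Bayesian Cramer-Rao bound equals the
# MMSE, so relaxing the constraints loses nothing here.
print(mmse, mc_mse)
```

The Monte Carlo MSE of the posterior mean matches the closed-form MMSE; in nonlinear or non-Gaussian models the two sides separate, which is where the relaxed bounds become informative.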

    NON ASYMPTOTIC EFFICIENCY OF A MAXIMUM LIKELIHOOD ESTIMATOR AT FINITE NUMBER OF SAMPLES

    No full text
    International audience
    In estimation theory, the asymptotic (in the number of samples) efficiency of the Maximum Likelihood (ML) estimator is a well-known result [1]. Nevertheless, in some scenarios, the number of snapshots may be small. We recently investigated the asymptotic behavior of the Stochastic ML (SML) estimator at high Signal-to-Noise Ratio (SNR) and finite number of samples [2] in the array processing framework: we proved the non-Gaussianity of the SML estimator and obtained an analytical expression of its variance in the single-source case. In this paper, we generalize these results to multiple sources and obtain variance expressions which demonstrate the non-efficiency of SML estimates.
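The finite-sample non-efficiency of ML estimators is easy to reproduce even in a scalar setting. The sketch below is illustrative only: it uses an exponential-rate model rather than the paper's array-processing model, and compares the Monte Carlo MSE of the ML estimate against the Cramér-Rao bound for a small and a large sample size.

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 2.0                     # true rate of the exponential distribution
trials = 200_000
results = {}

def ml_mse(T):
    """Monte Carlo MSE of the ML estimate lam_hat = 1/sample_mean."""
    x = rng.exponential(1.0 / lam, size=(trials, T))
    lam_hat = 1.0 / x.mean(axis=1)
    return np.mean((lam_hat - lam) ** 2)

for T in (5, 500):
    crb = lam ** 2 / T        # Cramer-Rao bound for the rate of Exp(lam)
    results[T] = ml_mse(T) / crb
    print(f"T={T}: MSE/CRB = {results[T]:.3f}")
```

At T = 5 the MSE sits well above the bound, while at T = 500 the ratio is close to 1: asymptotic efficiency says nothing about small-sample behavior, which is the regime the paper characterizes for the SML estimator.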

    CRLB under K-distributed observation with parameterized mean

    No full text
    International audience
    A semi-closed-form expression of the Fisher information matrix for K-distributed observations with parameterized mean is given and related to the classical, i.e., Gaussian, case. The connection is made via a simple multiplicative factor, which depends only on the intrinsic parameters of the texture and the size of the observation vector. Finally, a numerical simulation is provided to corroborate the theoretical analysis.
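For the classical Gaussian case referenced above, with known noise variance, the FIM reduces to F_ij = (1/sig2) (dm/dtheta_i)^T (dm/dtheta_j). A minimal numerical sketch (the tone-frequency example and function names are assumptions, not from the paper; per the abstract, the K-distributed FIM is this matrix rescaled by a texture-dependent scalar):

```python
import numpy as np

def gaussian_fim(m, theta, sig2, eps=1e-6):
    """Numerical FIM for x ~ N(m(theta), sig2*I):
    F_ij = (1/sig2) * (dm/dtheta_i)^T (dm/dtheta_j)."""
    theta = np.atleast_1d(np.asarray(theta, dtype=float))
    p = theta.size
    jac = np.empty((m(theta).size, p))
    for i in range(p):
        d = np.zeros(p)
        d[i] = eps
        jac[:, i] = (m(theta + d) - m(theta - d)) / (2.0 * eps)  # central difference
    return jac.T @ jac / sig2

# Hypothetical example: single real tone, m_n(f) = cos(2*pi*f*n).
N, sig2, f0 = 64, 0.1, 0.12
n = np.arange(N)
m = lambda th: np.cos(2.0 * np.pi * th[0] * n)

fim = gaussian_fim(m, [f0], sig2)
crlb = 1.0 / fim[0, 0]   # Gaussian CRLB; the paper's K-distributed CRLB
print(crlb)              # rescales this by the texture-dependent factor
```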

    Non-efficacité et non-gaussianité asymptotiques d'un estimateur du maximum de vraisemblance à fort rapport signal sur bruit

    No full text
    National audience
    In estimation theory, for independent observations with identical probability densities, the asymptotic efficiency (in the number T of observations) of the Maximum Likelihood (ML) method is a well-known result that characterizes its performance when T is large. In some situations, however, the number of observations may be small and this result no longer applies. In the array processing framework, under a stochastic model of the signals emitted by the sources, we fill this gap when the Signal-to-Noise Ratio (SNR) is high. We show that in this situation the ML estimator is asymptotically (in SNR) non-efficient and non-Gaussian.

    Unconditional maximum likelihood performance at finite number of samples and high signal-to-noise ratio

    No full text
    International audience
    This correspondence deals with the problem of estimating signal parameters using an array of sensors. In source localization, two main maximum-likelihood methods have been introduced: the conditional maximum-likelihood method, which assumes the source signals are nonrandom, and the unconditional maximum-likelihood method, which assumes the source signals are random. Many theoretical investigations have already been conducted for the large-sample statistical properties. This correspondence studies the behavior of unconditional maximum likelihood at high signal-to-noise ratio for finite samples. We first establish the equivalence between the unconditional and conditional maximum-likelihood criteria at high signal-to-noise ratio. Using this equivalence, we then prove the non-Gaussianity and non-efficiency of the unconditional maximum-likelihood estimator. We also rediscover the closed-form expressions of the probability density function and of the variance of the estimates in the one-source scenario, and we derive a closed-form expression of the estimator variance in the two-source scenario.
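For a single source, the conditional maximum-likelihood criterion concentrates to maximizing the energy of the data projected onto the signal subspace, i.e. |a(f)^H x|^2 for a unit-norm signal vector a(f). A minimal grid-search sketch under an assumed model (a single complex tone rather than the paper's sensor-array setup):

```python
import numpy as np

rng = np.random.default_rng(2)

# For one source, the conditional ML estimate maximizes the concentrated
# criterion |a(f)^H x|^2, with a(f) the unit-norm signal vector.
# Hypothetical single-tone model (not the paper's array setup).
N = 32
n = np.arange(N)
a = lambda f: np.exp(2j * np.pi * f * n) / np.sqrt(N)

f_true, snr_db = 0.2, 30.0
sig2 = 10.0 ** (-snr_db / 10.0)            # noise variance for unit amplitude
x = np.sqrt(N) * a(f_true) + np.sqrt(sig2 / 2.0) * (
    rng.standard_normal(N) + 1j * rng.standard_normal(N))

grid = np.linspace(0.0, 0.5, 20001)
crit = np.abs(np.array([a(f).conj() @ x for f in grid])) ** 2
f_hat = grid[int(np.argmax(crit))]         # conditional ML estimate on the grid
print(f_hat)
```

At high SNR the estimate lands on the main lobe near the true parameter; repeating the experiment over noise draws is the kind of Monte Carlo study against which the finite-sample variance expressions above can be checked.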

    The Bayesian ABEL Bound on the Mean Square Error

    No full text
    International audience
    This paper deals with lower bounds on the Mean Square Error (MSE). In the Bayesian framework, we present a new bound derived from a constrained optimization problem. This bound is found to be tighter than the Bayesian Bhattacharyya bound, the Reuven-Messer bound, the Bobrovsky-Zakai bound, and the Bayesian Cramér-Rao bound.

    On the high-SNR conditional maximum-likelihood estimator full statistical characterization

    No full text
    International audience
    In the field of asymptotic performance characterization of the conditional maximum-likelihood (CML) estimator, "asymptotic" generally refers either to the number of samples or to the signal-to-noise ratio (SNR). The first case has already been fully characterized, whereas the second has been only partially investigated. This correspondence therefore aims to provide a sound proof of a result generally regarded as trivial but not so far demonstrated: the asymptotic (in SNR) Gaussianity and efficiency of the CML estimator in the multiple-parameter case.

    A Useful Form of the Abel Bound and Its Application to Estimator Threshold Prediction

    No full text
    International audience
    This correspondence investigates the Abel bound in order to predict the estimators' mean square error (MSE) threshold effect. A tractable and computationally efficient form of this bound is derived. This form combines the Chapman–Robbins and Cramér–Rao bounds. The bound is applied to a data-aided carrier frequency estimation problem, for which a closed-form expression is provided. An indicator of the signal-to-noise ratio threshold is proposed. A comparison with recent results on the Barankin bound (Chapman–Robbins version) shows the superiority of the Abel-bound version in predicting the MSE threshold without increasing the computational complexity.
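The Chapman–Robbins ingredient of the Abel bound can be sketched numerically. For a deterministic Gaussian model x ~ N(s(f), sig2*I), the single-test-point Chapman–Robbins bound has the closed form sup_h h^2 / (exp(||s(f+h) - s(f)||^2 / sig2) - 1); sweeping the SNR exposes the threshold behavior that the CRB alone misses. The real-tone model below is an illustrative assumption, not the paper's carrier-frequency setup.

```python
import numpy as np

# Chapman-Robbins bound for x ~ N(s(f), sig2*I):
#   ChR = sup_h  h^2 / (exp(||s(f0+h) - s(f0)||^2 / sig2) - 1).
# Hypothetical real-tone model s_n(f) = cos(2*pi*f*n).
N = 16
n = np.arange(N)
s = lambda f: np.cos(2.0 * np.pi * f * n)
f0 = 0.25

def chapman_robbins(sig2, h_grid):
    d = np.array([np.sum((s(f0 + h) - s(f0)) ** 2) for h in h_grid])
    r = np.minimum(d / sig2, 700.0)  # cap the exponent; capped terms are negligible
    return np.max(h_grid ** 2 / np.expm1(r))

h_grid = np.linspace(1e-4, 0.24, 2400)
bounds = {}
for snr_db in (20, 0, -10):
    sig2 = 10.0 ** (-snr_db / 10.0)
    bounds[snr_db] = chapman_robbins(sig2, h_grid)
    print(snr_db, bounds[snr_db])
```

At high SNR the supremum is attained near h = 0 and the bound sits at the CRB level; as the SNR drops, distant test points take over and the bound jumps by orders of magnitude, which is exactly the threshold indicator discussed above.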

    A Fresh Look at the Bayesian Bounds of the Weiss-Weinstein Family

    No full text
    International audience
    Minimal bounds on the mean square error (MSE) are generally used to predict the best achievable performance of an estimator for a given observation model. In this paper, we are interested in the Bayesian bounds of the Weiss–Weinstein family. This family includes the Bayesian Cramér–Rao bound, the Bobrovsky–Mayer-Wolf–Zakaï bound, the Bayesian Bhattacharyya bound, the Bobrovsky–Zakaï bound, the Reuven–Messer bound, and the Weiss–Weinstein bound. We present a unification of all these minimal bounds based on a rewriting of the minimum mean square error estimator (MMSEE) and on a constrained optimization problem. This approach yields a useful theoretical framework for deriving new Bayesian bounds, and we propose two. First, we propose a generalization of the Bayesian Bhattacharyya bound extending the work of Bobrovsky, Mayer-Wolf, and Zakaï. Second, we propose a bound based on the Bayesian Bhattacharyya bound and the Reuven–Messer bound, generalizing both. The proposed bound is the Bayesian extension of the deterministic Abel bound and is found to be tighter than the Bayesian Bhattacharyya bound, the Reuven–Messer bound, the Bobrovsky–Zakaï bound, and the Bayesian Cramér–Rao bound. We give closed-form expressions of these bounds for a general Gaussian observation model with parameterized mean. To illustrate our results, we present simulations in the context of a spectral analysis problem.

    Une nouvelle approche des bornes Bayésiennes

    Get PDF
    This paper deals with minimal bounds on the mean square error in a Bayesian framework. We express the mean square error of the conditional mean estimator, which is the best Bayesian bound, as an optimization problem under a continuum of constraints. Discretizing this continuum leads to a method that unifies the classical bounds. Moreover, introducing new constraints yields previously unexplored bounds. In this way, our approach allows us to derive a Bayesian version of the Abel bound.