32 research outputs found

    Statistical analysis of adaptive maximum-likelihood signal estimator

    Thesis (Elec. E.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1995. Includes bibliographical references (leaves 56-57). By Christ D. Richmond. Elec.E.

    The Bayesian ABEL Bound on the Mean Square Error

    No full text
    This paper deals with lower bounds on the mean square error (MSE). In the Bayesian framework, we present a new bound derived from a constrained optimization problem. This bound is found to be tighter than the Bayesian Bhattacharyya bound, the Reuven-Messer bound, the Bobrovsky-Zakai bound, and the Bayesian Cramér-Rao bound.
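    For orientation, the loosest member of the family cited above, the Bayesian Cramér-Rao bound, has the standard scalar form (a textbook statement given here for context, not reproduced from the paper): for any estimator \hat{\theta}(x) of \theta with joint density p(x,\theta),

        E_{x,\theta}\big[ (\hat{\theta}(x) - \theta)^2 \big] \;\ge\; \Big( E_{x,\theta}\Big[ \big( \partial \ln p(x,\theta) / \partial\theta \big)^2 \Big] \Big)^{-1}.

    Roughly speaking, the tighter members of the family are obtained by replacing the score function \partial \ln p(x,\theta)/\partial\theta with richer families of test functions within the same constrained-optimization framework.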

    A Fresh Look at the Bayesian Bounds of the Weiss-Weinstein Family

    No full text
    Minimal bounds on the mean square error (MSE) are generally used to predict the best achievable performance of an estimator for a given observation model. In this paper, we are interested in the Bayesian bounds of the Weiss–Weinstein family. This family includes the Bayesian Cramér–Rao bound, the Bobrovsky–Mayer-Wolf–Zakaï bound, the Bayesian Bhattacharyya bound, the Bobrovsky–Zakaï bound, the Reuven–Messer bound, and the Weiss–Weinstein bound. We present a unification of all these minimal bounds based on a rewriting of the minimum mean square error estimator (MMSEE) and on a constrained optimization problem. With this approach, we obtain a useful theoretical framework for deriving new Bayesian bounds. For that purpose, we propose two bounds. First, we propose a generalization of the Bayesian Bhattacharyya bound extending the works of Bobrovsky, Mayer-Wolf, and Zakaï. Second, we propose a bound based on the Bayesian Bhattacharyya bound and on the Reuven–Messer bound, representing a generalization of both. The proposed bound is the Bayesian extension of the deterministic Abel bound and is found to be tighter than the Bayesian Bhattacharyya bound, the Reuven–Messer bound, the Bobrovsky–Zakaï bound, and the Bayesian Cramér–Rao bound. We give closed-form expressions of these bounds for a general Gaussian observation model with parameterized mean. To illustrate our results, we present simulations in the context of a spectral analysis problem.
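    For context, the tightest member listed above, the Weiss–Weinstein bound, is commonly quoted for a scalar parameter with a single test point h and s = 1/2 as follows (a standard form stated here for orientation, not taken from the paper, and assuming the shifted densities remain within the prior's support):

        WWB = \sup_h \; \frac{ h^2 \, \exp\!\big( 2\eta(\tfrac{1}{2}, h) \big) }{ 2 \big[ 1 - \exp\!\big( \eta(\tfrac{1}{2}, 2h) \big) \big] },
        \qquad
        \eta(s, h) = \ln \int\!\!\int p(x, \theta + h)^{s} \, p(x, \theta)^{1-s} \, dx \, d\theta,

    where p(x, \theta) is the joint density of the observations and the parameter.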

    Adaptive array signal processing and performance analysis in non-Gaussian environments

    No full text
    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1996. Includes bibliographical references (p. 146-151). By Christ D. Richmond. Ph.D.

    Sample covariance based estimation of Capon algorithm error probabilities

    No full text
    The method of interval estimation (MIE) provides a strategy for mean squared error (MSE) prediction of algorithm performance at low signal-to-noise ratios (SNR) below the estimation threshold, where asymptotic predictions fail. MIE interval error probabilities for the Capon algorithm are known and depend on the true data covariance and the assumed signal array response. Here, estimation of these error probabilities is considered in order to provide more representative measurement errors for parameter estimates obtained in low-SNR scenarios, as this may improve overall target tracking performance. A statistical analysis of Capon error probability estimation based on the data sample covariance matrix is explored. National Science Foundation (U.S.) (grant CCF-0829421); National Science Foundation (U.S.) (grant DMS-1035400).
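    The MIE prediction itself combines the probabilities of gross (sidelobe-level) errors with the small-error asymptotic MSE. The Python sketch below illustrates that weighted combination; the function name, arguments, and toy numbers are illustrative assumptions, not from the paper, which additionally addresses estimating the interval error probabilities from the sample covariance matrix.

        import numpy as np

        def mie_mse_prediction(interval_probs, interval_centers, theta_true, local_mse):
            """MIE-style MSE prediction: weighted sum of gross-error and small-error terms."""
            interval_probs = np.asarray(interval_probs, dtype=float)
            interval_centers = np.asarray(interval_centers, dtype=float)

            # Gross ("interval") errors: each sidelobe/ambiguity interval contributes its
            # probability times the squared distance of its representative point from truth.
            gross = np.sum(interval_probs * (interval_centers - theta_true) ** 2)

            # With the remaining probability the estimate stays near the mainlobe,
            # where the asymptotic (small-error) MSE applies.
            return gross + (1.0 - interval_probs.sum()) * local_mse

        # Toy usage with hypothetical sidelobe error probabilities and locations.
        print(mie_mse_prediction(interval_probs=[0.05, 0.02],
                                 interval_centers=[0.30, -0.45],
                                 theta_true=0.0,
                                 local_mse=1e-3))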
