
    Lower Bounds on Exponential Moments of the Quadratic Error in Parameter Estimation

    Considering the problem of risk-sensitive parameter estimation, we propose a fairly wide family of lower bounds on the exponential moments of the quadratic error, in both the Bayesian and the non-Bayesian regimes. This family of bounds, which is based on a change of measures, offers considerable freedom in the choice of the reference measure, and our efforts are devoted to exploring this freedom to a certain extent. Our focus is mostly on signal models that are relevant to communication problems, namely, models of a parameter-dependent (modulated) signal corrupted by additive white Gaussian noise, but the proposed methodology is also applicable to other types of parametric families, such as models of linear systems driven by random input signals (white noise, in most cases), and others. In addition to the well-known motivations for the risk-sensitive cost function (i.e., the exponential quadratic cost function), most notably its robustness to model uncertainty, we also view this cost function as a tool for studying fundamental limits concerning the tail behavior of the estimation error. Another interesting aspect, which we demonstrate in a certain parametric model, is that the risk-sensitive cost function may be subject to phase transitions, owing to some analogies with statistical mechanics.
    Comment: 28 pages; 4 figures; submitted for publication
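    A small illustration of the two points the abstract makes about the exponential quadratic cost (this is our sketch, not the paper's bounds): for a Gaussian error X ~ N(0, sigma^2), the exponential moment E[exp(s X^2)] has the closed form 1/sqrt(1 - 2 s sigma^2), which is finite only when 2 s sigma^2 < 1 and blows up at the critical value of s, giving a concrete instance of the phase-transition behavior mentioned.

```python
import numpy as np

def exp_moment_gauss(s, sigma2):
    """E[exp(s X^2)] for X ~ N(0, sigma2): finite only when 2*s*sigma2 < 1."""
    if 2.0 * s * sigma2 >= 1.0:
        return np.inf  # the moment diverges -- the "phase transition" in s
    return 1.0 / np.sqrt(1.0 - 2.0 * s * sigma2)

# Monte Carlo check below the critical point s = 1/(2 sigma^2):
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 200_000)  # estimation errors, sigma^2 = 1
s = 0.2
mc = np.mean(np.exp(s * x**2))
print(f"closed form: {exp_moment_gauss(s, 1.0):.4f}, Monte Carlo: {mc:.4f}")
print(f"at s = 0.6 the moment is {exp_moment_gauss(0.6, 1.0)}")
```

    Because the cost grows exponentially in the squared error, estimators with identical mean squared error but heavier error tails are penalized much more heavily, which is why the abstract can use this cost to probe tail behavior.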

    Performance Bounds for Parameter Estimation under Misspecified Models: Fundamental findings and applications

    Inferring information from a set of acquired data is the main objective of any signal processing (SP) method. In particular, the common problem of estimating the value of a vector of parameters from a set of noisy measurements is at the core of a plethora of scientific and technological advances in recent decades; for example, wireless communications, radar and sonar, biomedicine, image processing, and seismology, just to name a few. Developing an estimation algorithm often begins by assuming a statistical model for the measured data, i.e., a probability density function (pdf) which, if correct, fully characterizes the behaviour of the collected data/measurements. Experience with real data, however, often exposes the limitations of any assumed data model, since modelling errors at some level are always present. Consequently, the true data model and the model assumed to derive the estimation algorithm could differ. When this happens, the model is said to be mismatched or misspecified. Therefore, understanding the possible performance loss or regret that an estimation algorithm could experience under model misspecification is of crucial importance for any SP practitioner. Further, understanding the limits on the performance of any estimator subject to model misspecification is of practical interest. Motivated by the widespread and practical need to assess the performance of a mismatched estimator, the goal of this paper is to help bring attention to the main theoretical findings on estimation theory, and in particular on lower bounds under model misspecification, that have been published in the statistical and econometric literature in the last fifty years. In addition, some applications are discussed to illustrate the broad range of areas and problems to which this framework extends, and consequently the numerous opportunities available for SP researchers.
    Comment: To appear in the IEEE Signal Processing Magazine
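    A toy numerical sketch of the kind of gap these bounds quantify (ours, not the paper's misspecified-CRB machinery): the designer assumes Gaussian noise with the wrong variance, uses the sample mean as the ML estimator, and predicts a Cramér-Rao bound from the assumed model. The estimator's actual variance is governed by the true model, so the assumed bound is optimistic.

```python
import numpy as np

rng = np.random.default_rng(1)
mu_true, sigma_true = 2.0, 2.0   # true data distribution (illustrative values)
sigma_model = 1.0                # misspecified noise std assumed by the designer
n, trials = 50, 20_000

# ML estimate of the mean under the (Gaussian) assumed model is the sample mean:
est = rng.normal(mu_true, sigma_true, (trials, n)).mean(axis=1)

emp_var = est.var()                # achieved variance across Monte Carlo trials
crb_assumed = sigma_model**2 / n   # CRB the designer would (wrongly) predict
crb_true = sigma_true**2 / n       # actual variance of the sample mean
print(f"empirical: {emp_var:.4f}, assumed CRB: {crb_assumed:.4f}, true: {crb_true:.4f}")
```

    Here the estimator remains unbiased, so only its predicted accuracy is wrong; in general misspecification also shifts the estimate toward a pseudo-true parameter, which is what the misspecified lower bounds surveyed in the paper account for.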

    Bayesian Cramér-Rao Lower Bound for Magnetic Field-Based Localization

    In this paper, we show how to analyze the achievable position accuracy of magnetic localization based on Bayesian Cramér-Rao lower bounds and how to account for deterministic inputs in the bound. The derivation of the bound requires an analytical model, e.g., a map or database, that links the position that is to be estimated to the corresponding magnetic field value. Unfortunately, finding an analytical model from the laws of physics is not feasible due to the complexity of the involved differential equations and the required knowledge about the environment. In this paper, we therefore use a Gaussian process (GP) that approximates the true analytical model based on training data. The GP ensures a smooth, differentiable likelihood and allows a strict Bayesian treatment of the estimation problem. Based on a novel set of measurements recorded in an indoor environment, the bound is evaluated for different sensor heights and is compared to the mean squared error of a particle filter. Furthermore, the bound is calculated for the case when only the magnetic magnitude is used for positioning and the case when the whole vector field is considered. For both cases, the resulting position bound is below 10 cm, indicating a high potential accuracy of magnetic localization.
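    A minimal one-dimensional sketch of the idea (hypothetical field map and kernel choices are ours): a GP with an RBF kernel is fitted to training pairs of position and field magnitude, its posterior mean provides a differentiable surrogate model, and the gradient of that mean together with the sensor noise gives a scalar Cramér-Rao-type bound on position.

```python
import numpy as np

def rbf(a, b, ell=0.3):
    """Squared-exponential kernel between 1-D position arrays."""
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

rng = np.random.default_rng(2)
xs = np.linspace(0.0, 2.0, 40)           # training positions [m] (illustrative)
field = np.sin(3.0 * xs) + 0.3 * xs      # hypothetical magnitude map
noise = 0.05                             # sensor noise std (assumed)

K = rbf(xs, xs) + noise**2 * np.eye(len(xs))
alpha = np.linalg.solve(K, field)        # GP regression weights

def gp_mean(x):
    """Posterior mean of the GP surrogate at query position(s) x."""
    return rbf(np.atleast_1d(x), xs) @ alpha

# The bound needs the gradient of the mean map; take it numerically here:
x0, h = 1.0, 1e-4
grad = (gp_mean(x0 + h) - gp_mean(x0 - h))[0] / (2 * h)
crb_pos = noise**2 / grad**2             # scalar position bound [m^2]
print(f"d(mean)/dx at x0: {grad:.3f}, position CRB: {crb_pos:.6f} m^2")
```

    In the paper's setting the map is three-dimensional and the bound is the Bayesian CRLB with a prior over position, but the same ingredients appear: a differentiable GP mean and the measurement noise model.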

    Parameter estimation for peaky altimetric waveforms

    Much attention has recently been devoted to the analysis of coastal altimetric waveforms. When approaching the coast, altimetric waveforms are sometimes corrupted by peaks caused by highly reflective areas inside the illuminated land surfaces or by the modification of the sea state close to the shoreline. This paper introduces a new parametric model for these peaky altimetric waveforms. This model assumes that the received altimetric waveform is the sum of a Brown echo and an asymmetric Gaussian peak. The asymmetric Gaussian peak is parameterized by a location, an amplitude, a width, and an asymmetry coefficient. A maximum-likelihood estimator is studied to estimate the parameters of the Brown-plus-peak model. The Cramér–Rao lower bounds of the model parameters are then derived, providing minimum variances for any unbiased estimator, i.e., a reference in terms of estimation error. The performance of the proposed model and the resulting estimation strategy are evaluated via many simulations conducted on synthetic and real data. Results obtained in this paper show that the proposed model can be used to efficiently retrack standard oceanic Brown echoes as well as coastal echoes corrupted by symmetric or asymmetric Gaussian peaks. Thus, the Brown-with-Gaussian-peak model is useful for analyzing altimetric measurements closer to the coast.
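    One way to realize the four-parameter peak the abstract describes is a two-sided Gaussian whose width differs on each side of the peak location; this is a plausible sketch of such a parameterization (the paper's exact functional form may differ, and all parameter names here are ours).

```python
import numpy as np

def asym_gauss_peak(t, A, T, sigma, eta):
    """Asymmetric Gaussian peak: amplitude A, location T, width sigma,
    asymmetry coefficient eta (eta = 0 recovers a symmetric Gaussian)."""
    width = np.where(t < T, sigma, sigma * (1.0 + eta))  # wider trailing side
    return A * np.exp(-0.5 * ((t - T) / width)**2)

t = np.linspace(0.0, 100.0, 128)   # gate index (illustrative sampling)
peak = asym_gauss_peak(t, A=1.5, T=60.0, sigma=3.0, eta=0.8)
# The full waveform model in the paper is a Brown echo plus such a peak;
# the Brown echo adds an erf-shaped leading edge and an exponential trailing
# edge, and the ML estimator fits all parameters of the sum jointly.
print(f"peak maximum: {peak.max():.3f}")
```

    With this form, eta > 0 makes the trailing side decay more slowly than the leading side, which is the asymmetry the coastal peaks exhibit.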

    Approximate Gaussian conjugacy: parametric recursive filtering under nonlinearity, multimodality, uncertainty, and constraint, and beyond

    Since the landmark work of R. E. Kalman in the 1960s, considerable efforts have been devoted to time series state space models for a large variety of dynamic estimation problems. In particular, parametric filters that seek analytical estimates based on a closed-form Markov–Bayes recursion, e.g., recursion from a Gaussian or Gaussian mixture (GM) prior to a Gaussian/GM posterior (termed ‘Gaussian conjugacy’ in this paper), form the backbone of general time series filter design. Due to challenges arising from nonlinearity, multimodality (including target maneuver), intractable uncertainties (such as unknown inputs and/or non-Gaussian noises) and constraints (including circular quantities), etc., new theories, algorithms, and technologies have been developed continuously to maintain such a conjugacy, or to approximate it as closely as possible. These efforts have contributed in large part to the development of time series parametric filters over the last six decades. In this paper, we review the state of the art in distinctive categories and highlight some insights that may otherwise be easily overlooked. In particular, specific attention is paid to nonlinear systems with an informative observation, multimodal systems including Gaussian mixture posteriors and maneuvers, and intractable unknown inputs and constraints, to fill some gaps in existing reviews and surveys. In addition, we provide some new thoughts on alternatives to the first-order Markov transition model and on filter evaluation with regard to computational complexity.
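    The canonical instance of the exact Gaussian conjugacy the abstract refers to is the Kalman filter: for a linear-Gaussian state space model, a Gaussian prior maps to a Gaussian posterior in closed form at every step. A minimal scalar sketch (illustrative model parameters, not from the paper):

```python
import numpy as np

def kalman_step(m, P, y, a=1.0, q=0.1, h=1.0, r=0.5):
    """One Markov-Bayes recursion for the scalar linear-Gaussian model
    x_k = a x_{k-1} + w (w ~ N(0, q)),  y_k = h x_k + v (v ~ N(0, r)).
    The Gaussian prior N(m, P) maps exactly to a Gaussian posterior."""
    m_pred, P_pred = a * m, a * a * P + q      # predict
    S = h * h * P_pred + r                     # innovation variance
    K = P_pred * h / S                         # Kalman gain
    m_post = m_pred + K * (y - h * m_pred)     # update mean
    P_post = (1.0 - K * h) * P_pred            # update variance
    return m_post, P_post

rng = np.random.default_rng(3)
x, m, P = 0.0, 0.0, 1.0
for _ in range(200):
    x = x + rng.normal(0.0, np.sqrt(0.1))      # simulate the random-walk state
    y = x + rng.normal(0.0, np.sqrt(0.5))      # noisy measurement
    m, P = kalman_step(m, P, y)
print(f"posterior: N({m:.3f}, {P:.3f}), true state: {x:.3f}")
```

    The survey's subject is precisely what happens when the linear-Gaussian assumptions behind this recursion break down (nonlinearity, multimodality, unknown inputs, constraints) and the conjugacy must be maintained approximately, e.g., via Gaussian mixture or sampling-based posteriors.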

    Parameter Estimation
