    Levenberg-Marquardt and Line-Search Extended Kalman Smoothers

    The aim of this article is to present Levenberg–Marquardt and line-search extensions of the classical iterated extended Kalman smoother (IEKS), which has previously been shown to be equivalent to the Gauss–Newton method. The algorithms are derived by rewriting their steps in forms that can be efficiently implemented using modified EKS iterations. The resulting algorithms are experimentally shown to have superior convergence properties over the classical IEKS.
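    The abstract's core contrast can be illustrated without the smoother machinery: Gauss–Newton solves the undamped normal equations, while Levenberg–Marquardt adds an adaptive damping term that makes iterations more robust far from the optimum. The sketch below applies both step rules to a toy exponential-fitting problem; the model, data, and damping schedule are illustrative assumptions, not the paper's IEKS-based implementation.

    ```python
    import numpy as np

    def gauss_newton_step(J, r):
        # Gauss-Newton: solve (J^T J) dx = -J^T r
        return np.linalg.solve(J.T @ J, -J.T @ r)

    def levenberg_marquardt_step(J, r, lam):
        # LM: damped normal equations (J^T J + lam*I) dx = -J^T r
        n = J.shape[1]
        return np.linalg.solve(J.T @ J + lam * np.eye(n), -J.T @ r)

    # Hypothetical residual: fit y = exp(a*t) to noiseless data with true a = 0.7
    t = np.linspace(0.0, 1.0, 20)
    y = np.exp(0.7 * t)

    def residual(a):
        return np.exp(a * t) - y

    def jacobian(a):
        return (t * np.exp(a * t)).reshape(-1, 1)

    a, lam = 0.0, 1e-2
    for _ in range(30):
        r, J = residual(a), jacobian(a)
        step = levenberg_marquardt_step(J, r, lam)[0]
        if np.sum(residual(a + step) ** 2) < np.sum(r ** 2):
            a += step          # accept the step and relax the damping
            lam *= 0.5
        else:
            lam *= 2.0         # reject the step and increase the damping
    ```

    In the paper's setting the same damped step is realized implicitly by running a modified extended Kalman smoother pass, rather than by forming the normal equations explicitly.
    
    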

    Approximate Gaussian conjugacy: parametric recursive filtering under nonlinearity, multimodality, uncertainty, and constraint, and beyond

    Since the landmark work of R. E. Kalman in the 1960s, considerable effort has been devoted to time series state space models for a large variety of dynamic estimation problems. In particular, parametric filters that seek analytical estimates based on a closed-form Markov–Bayes recursion, e.g., recursion from a Gaussian or Gaussian mixture (GM) prior to a Gaussian/GM posterior (termed 'Gaussian conjugacy' in this paper), form the backbone of general time series filter design. Due to challenges arising from nonlinearity, multimodality (including target maneuver), intractable uncertainties (such as unknown inputs and/or non-Gaussian noises), and constraints (including circular quantities), new theories, algorithms, and technologies have been developed continuously to maintain such conjugacy, or to approximate it as closely as possible. These have contributed in large part to the development of time series parametric filters over the last six decades. In this paper, we review the state of the art in distinctive categories and highlight some insights that may otherwise be easily overlooked. In particular, specific attention is paid to nonlinear systems with an informative observation, multimodal systems including Gaussian mixture posteriors and maneuvers, and intractable unknown inputs and constraints, to fill some gaps in existing reviews and surveys. In addition, we provide some new thoughts on alternatives to the first-order Markov transition model and on filter evaluation with regard to computational complexity.
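    The "Gaussian conjugacy" the survey centers on is exact in the linear-Gaussian case: one Markov–Bayes recursion step maps a Gaussian prior N(m, P) to a Gaussian posterior in closed form. A minimal sketch of that recursion, with hypothetical constant-velocity model matrices (not taken from the paper):

    ```python
    import numpy as np

    # Hypothetical linear-Gaussian model: constant-velocity dynamics,
    # position-only measurement. In this case the Kalman recursion is
    # the exact closed-form Markov-Bayes update (Gaussian in, Gaussian out).
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition
    Q = 0.1 * np.eye(2)                      # process noise covariance
    H = np.array([[1.0, 0.0]])               # observation matrix
    R = np.array([[0.5]])                    # measurement noise covariance

    def kf_step(m, P, z):
        # Predict: push the Gaussian N(m, P) through the linear dynamics
        m_pred = F @ m
        P_pred = F @ P @ F.T + Q
        # Update: condition on z; the posterior is again Gaussian
        S = H @ P_pred @ H.T + R                 # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
        m_post = m_pred + K @ (z - H @ m_pred)
        P_post = (np.eye(2) - K @ H) @ P_pred
        return m_post, P_post

    m, P = np.zeros(2), np.eye(2)
    m, P = kf_step(m, P, np.array([1.2]))
    ```

    The challenges the survey catalogs (nonlinearity, multimodality, unknown inputs, constraints) are precisely the conditions under which this closed form breaks and must be approximated.
    
    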

    Gaussian MAP Filtering Using Kalman Optimization

    This paper deals with the update step of Gaussian MAP filtering. In this framework, we seek a Gaussian approximation to the posterior probability density function (PDF) whose mean is given by the maximum a posteriori (MAP) estimator. We propose two novel optimization algorithms which are well suited to finding the MAP estimate, although they can also be used to solve general optimization problems. These are based on the design of a sequence of PDFs that become increasingly concentrated around the MAP estimate. The resulting algorithms are referred to as Kalman optimization (KO) methods. We also establish the important relations between these KO methods and their conventional optimization algorithm (COA) counterparts, i.e., Newton's and Levenberg–Marquardt algorithms. Our simulations indicate that KO methods are more robust than their COA equivalents.
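    The link between Kalman-style updates and classical optimizers appears already in the well-known iterated update: repeatedly applying a Kalman measurement update at the relinearization point is a Gauss–Newton iteration on the Gaussian MAP objective. A sketch with a hypothetical scalar nonlinear measurement (this illustrates the general KO idea, not the paper's specific algorithms):

    ```python
    import numpy as np

    # MAP update: maximize p(x|z) ∝ N(x; m, P) * N(z; h(x), R)
    # via repeated Kalman-style updates relinearized at the current iterate
    # (the iterated EKF update, equivalent to Gauss-Newton on the MAP cost).
    # The prior, measurement model, and data below are illustrative assumptions.
    m, P = np.zeros(2), np.eye(2)                  # Gaussian prior
    R = np.array([[0.1]])                          # measurement noise
    h = lambda x: np.array([x[0] ** 2 + x[1]])     # nonlinear measurement
    H = lambda x: np.array([[2 * x[0], 1.0]])      # its Jacobian
    z = np.array([1.0])                            # observed value

    x = m.copy()
    for _ in range(10):
        Hx = H(x)
        S = Hx @ P @ Hx.T + R
        K = P @ Hx.T @ np.linalg.inv(S)
        # Kalman-update form of one Gauss-Newton step on the MAP objective
        x = m + K @ (z - h(x) - Hx @ (m - x))
    x_map = x
    ```

    At convergence the gradient of the MAP cost, P⁻¹(x − m) − H(x)ᵀR⁻¹(z − h(x)), vanishes; the KO methods in the paper can be read as more robust variants of this kind of optimization-by-filtering, paralleling the move from Newton to Levenberg–Marquardt.
    
    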