    Data Assimilation by Conditioning on Future Observations

    Conventional recursive filtering approaches, designed for quantifying the state of an evolving uncertain dynamical system with intermittent observations, use a sequence of (i) an uncertainty propagation step followed by (ii) a step where the associated data are assimilated using Bayes' rule. In this paper we switch the order of the steps to: (i) one-step-ahead data assimilation followed by (ii) uncertainty propagation. This route leads to a class of filtering algorithms named \emph{smoothing filters}. For a system driven by random noise, our proposed methods require the probability distribution of the driving noise after the assimilation to be biased by a nonzero mean. The system noise, conditioned on future observations, in turn pushes the filtering solution forward in time closer to the true state and thereby yields a more accurate approximate solution to the state estimation problem.
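    The reordering described above can be illustrated with a toy scalar linear-Gaussian example (this is a sketch of the general idea, not the paper's algorithm): conditioning both the current state and the driving noise on the next observation gives the noise a nonzero conditional mean, and propagating with that biased noise reproduces the usual predict-then-update Kalman estimate.

    ```python
    # Toy sketch (assumed model, not taken from the paper):
    #   x_{k+1} = a*x_k + w,  w ~ N(0, Q);   y_{k+1} = x_{k+1} + v,  v ~ N(0, R).
    # In this linear-Gaussian case the two orderings of the steps agree exactly.

    def standard_filter_mean(m, P, a, Q, R, y):
        """Usual order: propagate uncertainty, then assimilate y with Bayes' rule."""
        m_pred = a * m
        P_pred = a * a * P + Q
        K = P_pred / (P_pred + R)          # Kalman gain
        return m_pred + K * (y - m_pred)

    def assimilate_then_propagate_mean(m, P, a, Q, R, y):
        """Reordered: condition state AND driving noise on the future datum, then propagate."""
        S = a * a * P + Q + R              # variance of the predicted observation
        d = y - a * m                      # innovation w.r.t. the future observation
        m_cond = m + (a * P / S) * d       # E[x_k | y_{k+1}]
        w_mean = (Q / S) * d               # conditioned noise now has a nonzero mean
        return a * m_cond + w_mean         # push forward with the biased noise

    if __name__ == "__main__":
        m, P, a, Q, R, y = 1.0, 0.5, 0.9, 0.2, 0.1, 1.4
        s = standard_filter_mean(m, P, a, Q, R, y)
        r = assimilate_then_propagate_mean(m, P, a, Q, R, y)
        print(abs(s - r) < 1e-12)
    ```

    The equality of the two orderings is special to the linear-Gaussian setting; the interest of the reordering lies in nonlinear and approximate filters, where biasing the propagated noise toward the future observation can improve the state estimate.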

    The Kalman-Levy filter

    The Kalman filter combines forecasts and new observations to obtain an estimation which is optimal in the sense of a minimum average quadratic error. The Kalman filter has two main restrictions: (i) the dynamical system is assumed linear and (ii) forecasting errors and observational noises are taken Gaussian. Here, we offer an important generalization to the case where errors and noises have heavy tail distributions such as power laws and L\'evy laws. The main tool needed to solve this ``Kalman-L\'evy'' filter is the ``tail-covariance'' matrix which generalizes the covariance matrix in the case where it is mathematically ill-defined (i.e. for power law tail exponents μ ≤ 2). We present the general solution and discuss its properties on pedagogical examples. The standard Kalman-Gaussian filter is recovered for the case μ = 2. The optimal Kalman-L\'evy filter is found to deviate substantially from the standard Kalman-Gaussian filter as μ deviates from 2. As μ decreases, novel observations are assimilated with less and less weight, as a small exponent μ implies large errors with significant probabilities. In terms of implementation, the price to pay associated with the presence of heavy tail noise distributions is that the standard linear formalism valid for the Gaussian case is transformed into a nonlinear matrix equation for the Kalman-L\'evy filter. Direct numerical experiments in the univariate case confirm our theoretical predictions.Comment: 41 pages, 9 figures, correction of errors in the general multivariate case
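    For reference, a minimal scalar version of the Gaussian (μ = 2) limit that the Kalman-Lévy filter reduces to can be sketched as follows. The Lévy generalization replaces the covariance used below with the paper's "tail-covariance" construction, which is not reproduced here.

    ```python
    # Minimal scalar Kalman filter for the model
    #   x_{k+1} = a*x_k + w,  w ~ N(0, Q);   y_k = x_k + v,  v ~ N(0, R).
    # This is the standard Gaussian case (mu = 2); the Kalman-Levy filter
    # generalizes the gain computation to heavy-tailed noise.

    def kalman_step(m, P, y, a=1.0, Q=0.01, R=1.0):
        """One forecast/update cycle returning the new mean and variance."""
        m_pred, P_pred = a * m, a * a * P + Q    # forecast step
        K = P_pred / (P_pred + R)                # gain: weight given to the new datum
        m_new = m_pred + K * (y - m_pred)        # minimum-mean-square-error update
        P_new = (1.0 - K) * P_pred
        return m_new, P_new

    if __name__ == "__main__":
        truth = 3.0
        m, P = 0.0, 10.0                         # deliberately poor initial guess
        for step in range(200):
            y = truth + (1.0 if step % 2 == 0 else -1.0)   # bounded alternating noise
            m, P = kalman_step(m, P, y)
        print(round(m, 1))                       # estimate settles near the truth
    ```

    In the heavy-tail regime the abstract describes, the analogue of the gain K shrinks as μ decreases, since a single large-error observation becomes much more probable.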

    Open Boundaries for the Nonlinear Schrodinger Equation

    We present a new algorithm, the Time Dependent Phase Space Filter (TDPSF), which is used to solve time dependent Nonlinear Schrodinger Equations (NLS). The algorithm consists of solving the NLS on a box with periodic boundary conditions (by any algorithm). Periodically in time we decompose the solution into a family of coherent states. Coherent states which are outgoing are deleted, while those which are not are kept, reducing the problem of reflected (wrapped) waves. Numerical results are given, and rigorous error estimates are described. The TDPSF is compatible with spectral methods for solving the interior problem. The TDPSF also fails gracefully, in the sense that the algorithm notifies the user when the result is incorrect. We are aware of no other method with this capability.Comment: 21 pages, 4 figures
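    A minimal sketch of the kind of periodic-box spectral interior solver the TDPSF is designed to wrap is given below, for the focusing cubic NLS i u_t + (1/2) u_xx + |u|² u = 0 (an assumed normalization; the coherent-state filtering step itself is not reproduced). The split-step Fourier method alternates an exact nonlinear phase rotation with an exact linear propagation in Fourier space, and a soliton initial condition lets us check the solver.

    ```python
    import numpy as np

    # Split-step Fourier solver on a periodic box for
    #   i u_t + (1/2) u_xx + |u|^2 u = 0,
    # which admits the soliton solution u(x, t) = sech(x) * exp(i t / 2).
    # Without an outgoing-wave filter such as the TDPSF, radiation leaving the
    # box wraps around the periodic boundary and pollutes the interior.

    def split_step_nls(u, dt, steps, k):
        """Strang splitting: half nonlinear, full linear, half nonlinear."""
        lin = np.exp(-0.5j * k**2 * dt)                 # exact linear propagator
        for _ in range(steps):
            u = u * np.exp(0.5j * np.abs(u)**2 * dt)    # nonlinear half step
            u = np.fft.ifft(lin * np.fft.fft(u))        # linear full step
            u = u * np.exp(0.5j * np.abs(u)**2 * dt)    # nonlinear half step
        return u

    if __name__ == "__main__":
        N, L = 256, 40.0
        x = np.linspace(-L / 2, L / 2, N, endpoint=False)
        k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)      # spectral wavenumbers
        u0 = 1.0 / np.cosh(x)                           # soliton at t = 0
        u = split_step_nls(u0, dt=0.01, steps=100, k=k) # evolve to t = 1
        # The soliton's modulus should be preserved up to the splitting error.
        print(np.max(np.abs(np.abs(u) - 1.0 / np.cosh(x))))
    ```

    Both substeps conserve the discrete L² mass exactly (the nonlinear step is a pointwise phase rotation; the linear step is unitary by Parseval), which is a useful sanity check on any such interior solver.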