Truncated Moment Problem for Dirac Mixture Densities with Entropy Regularization
We assume that a finite set of moments of a random vector is given. Its
underlying density is unknown. An algorithm is proposed for efficiently
calculating Dirac mixture densities that maintain these moments while providing
homogeneous coverage of the state space.
Comment: 18 pages, 6 figures
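The idea can be sketched numerically. The following is a hypothetical 1-D illustration, not the paper's algorithm: an equally weighted Dirac mixture is placed so that its first two moments match given values, while a pairwise log-distance repulsion term stands in for the entropy-style coverage regularization. The function name and interface are assumptions for this sketch.

```python
import numpy as np
from scipy.optimize import minimize

def dirac_mixture_1d(n, mean, var, seed=0):
    """Place n equally weighted Dirac points matching a given mean and
    variance, spread out by a pairwise repulsion objective (coverage proxy)."""
    rng = np.random.default_rng(seed)
    x0 = rng.normal(mean, np.sqrt(var), n)          # initial guess
    constraints = [
        {"type": "eq", "fun": lambda x: x.mean() - mean},  # first moment
        {"type": "eq", "fun": lambda x: x.var() - var},    # second central moment
    ]
    def repulsion(x):
        # Minimizing -sum log|x_i - x_j| pushes points apart;
        # the identity added on the diagonal makes those terms log(1) = 0.
        d = np.abs(x[:, None] - x[None, :]) + np.eye(n)
        return -np.sum(np.log(d))
    res = minimize(repulsion, x0, constraints=constraints, method="SLSQP")
    return res.x
```

The moment conditions enter as hard equality constraints, so the returned points reproduce the prescribed mean and variance while the objective keeps them from clustering.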
Continuous, Semi-discrete, and Fully Discretized Navier-Stokes Equations
The Navier--Stokes equations are commonly used to model and to simulate flow
phenomena. We introduce the basic equations and discuss the standard methods
for the spatial and temporal discretization. We analyze the semi-discrete
equations -- a semi-explicit nonlinear DAE -- in terms of the strangeness index
and quantify the numerical difficulties in the fully discrete schemes that are
induced by the strangeness of the system. By analyzing the Kronecker index of
the difference-algebraic equations that represent commonly and successfully
used time-stepping schemes for the Navier--Stokes equations, we show that these
time-integration schemes do in fact remove the strangeness. The theoretical
considerations are backed and illustrated by numerical examples.
Comment: 28 pages, 2 figures, code available under DOI: 10.5281/zenodo.998909,
https://doi.org/10.5281/zenodo.998909
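As a toy illustration of why implicit time stepping resolves the constraint difficulty, consider a linear Stokes-like DAE M v' = A v + Jᵀ p with J v = 0. One implicit Euler step solves a saddle-point system in which the divergence constraint is imposed at the new time level, so the new velocity satisfies it exactly. The matrices in the sketch below are small random stand-ins, not an actual Navier--Stokes discretization.

```python
import numpy as np

def implicit_euler_step(M, A, J, v, h):
    """One implicit Euler step for the linear DAE  M v' = A v + J^T p,  J v = 0.

    Solves the saddle-point system
        [M - h A   -h J^T] [v_new]   [M v]
        [   J         0  ] [p_new] = [ 0 ]
    so the constraint J v_new = 0 holds by construction.
    """
    n, m = M.shape[0], J.shape[0]
    K = np.block([[M - h * A, -h * J.T],
                  [J, np.zeros((m, m))]])
    rhs = np.concatenate([M @ v, np.zeros(m)])
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n:]   # new velocity, pressure
```

Because the algebraic constraint is enforced at the new time level, the discrete velocity is consistent after a single step even if the initial velocity was not, which mirrors the abstract's point that these schemes remove the strangeness.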
A random map implementation of implicit filters
Implicit particle filters for data assimilation generate high-probability
samples by representing each particle location as a separate function of a
common reference variable. This representation requires that a certain
underdetermined equation be solved for each particle and at each time an
observation becomes available. We present a new implementation of implicit
filters in which we find the solution of the equation via a random map. As
examples, we assimilate data for a stochastically driven Lorenz system with
sparse observations and for a stochastic Kuramoto-Sivashinski equation with
observations that are sparse in both space and time.
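The random-map idea can be sketched in its simplest radial form: sample a reference variable ξ ~ N(0, I) and solve the scalar equation F(μ + λξ) − φ = ½ ξᵀξ for λ, where μ minimizes the particle's negative log-posterior F and φ = F(μ). This reduces the underdetermined equation to one deterministic root-find per particle. The interface below is an assumption for illustration, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import brentq, minimize

def implicit_sample(F, dim, rng):
    """Draw one high-probability sample via a radial random map.

    Solves F(mu + lam*xi) - phi = 0.5*xi@xi for the scalar lam,
    where mu minimizes F and phi = F(mu).
    """
    res = minimize(F, np.zeros(dim))     # locate the mode of the target
    mu, phi = res.x, res.fun
    xi = rng.standard_normal(dim)        # common reference variable
    rho = 0.5 * xi @ xi
    g = lambda lam: F(mu + lam * xi) - phi - rho
    hi = 1.0
    while g(hi) < 0.0:                   # bracket the root; g(0) = -rho <= 0
        hi *= 2.0
    lam = brentq(g, 0.0, hi)             # scalar solve along the ray
    return mu + lam * xi
```

For a Gaussian target the map gives λ = 1 and hence exact sampling; for non-Gaussian targets each particle costs one scalar solve, which is the computational point of the construction.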
Discretizing Distributions with Exact Moments: Error Estimate and Convergence Analysis
The maximum entropy principle is a powerful tool for solving underdetermined
inverse problems. This paper considers the problem of discretizing a continuous
distribution, which arises in various applied fields. We obtain the
approximating distribution by minimizing the Kullback-Leibler information
(relative entropy) of the unknown discrete distribution relative to an initial
discretization based on a quadrature formula subject to some moment
constraints. We study the theoretical error bound and the convergence of this
approximation method as the number of discrete points increases. We prove that
(i) the theoretical error bound of the approximate expectation of any bounded
continuous function has at most the same order as the quadrature formula we
start with, and (ii) the approximate discrete distribution weakly converges to
the given continuous distribution. Moreover, we present numerical examples
that show the advantages of the method, and we apply it to numerically solve
an optimal portfolio problem.
Comment: 20 pages, 14 figures
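The construction admits a compact convex-dual sketch: minimizing the Kullback-Leibler divergence KL(w‖q) subject to moment constraints yields exponential-family weights w_i ∝ q_i exp(Σ_k λ_k g_k(x_i)), with λ found by minimizing the dual log-partition function. The code below is a hedged illustration using a uniform initial grid and power moments as constraint functions; the function name and setup are assumptions, not the paper's exact scheme.

```python
import numpy as np
from scipy.optimize import minimize

def moment_matched_weights(nodes, q, target_moments):
    """Minimize KL(w || q) subject to sum_i w_i x_i^k = m_k, k = 1..K.

    Solved via the convex dual: the optimizer is an exponential tilt
    of q, w_i ~ q_i * exp(sum_k lam_k x_i^k).
    """
    m = np.asarray(target_moments)
    K = len(m)
    G = np.vstack([nodes ** (k + 1) for k in range(K)])  # K x n moment functions
    def dual(lam):
        z = q * np.exp(lam @ G)
        return np.log(z.sum()) - lam @ m   # log-partition minus linear term
    lam = minimize(dual, np.zeros(K)).x
    w = q * np.exp(lam @ G)
    return w / w.sum()
```

At the dual optimum the gradient of the log-partition vanishes, which is precisely the statement that the tilted weights reproduce the target moments while staying as close as possible (in relative entropy) to the initial quadrature discretization.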