Variational Bayesian Inference of Line Spectra
In this paper, we address the fundamental problem of line spectral estimation
in a Bayesian framework. We target model order and parameter estimation via
variational inference in a probabilistic model in which the frequencies are
continuous-valued, i.e., not restricted to a grid; and the coefficients are
governed by a Bernoulli-Gaussian prior model turning model order selection into
binary sequence detection. Unlike earlier works which retain only point
estimates of the frequencies, we undertake a more complete Bayesian treatment
by estimating the posterior probability density functions (pdfs) of the
frequencies and computing expectations over them. Thus, we additionally capture
and operate with the uncertainty of the frequency estimates. Aiming to maximize
the model evidence, variational optimization provides analytic approximations
of the posterior pdfs and also gives estimates of the additional parameters. We
propose an accurate representation of the pdfs of the frequencies by mixtures
of von Mises pdfs, which yields closed-form expectations. We define the
algorithm VALSE in which the estimates of the pdfs and parameters are
iteratively updated. VALSE is a gridless, convergent method, does not require
parameter tuning, can easily include prior knowledge about the frequencies and
provides approximate posterior pdfs based on which the uncertainty in line
spectral estimation can be quantified. Simulation results show that accounting
for the uncertainty of frequency estimates, rather than computing just point
estimates, significantly improves the performance. The performance of VALSE is
superior to that of state-of-the-art methods and closely approaches the
Cramér-Rao bound computed for the true model order.
Comment: 15 pages, 8 figures, accepted for publication in IEEE Transactions on Signal Processing
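The closed-form expectations mentioned above follow from a standard property of the von Mises distribution: its first circular moment is E[e^{iθ}] = (I₁(κ)/I₀(κ))·e^{iμ}, where I₀, I₁ are modified Bessel functions, so the moment of a mixture is just the weighted sum of component moments. A minimal sketch of this computation (not the VALSE algorithm itself; the mixture parameters below are made-up examples):

```python
import cmath
import math

def bessel_i(n, x, terms=60):
    """Modified Bessel function of the first kind I_n(x), via its power series."""
    return sum((x / 2.0) ** (n + 2 * k) / (math.factorial(k) * math.factorial(n + k))
               for k in range(terms))

def vm_mixture_circular_moment(weights, mus, kappas):
    """First circular moment E[e^{i*theta}] of a von Mises mixture.

    Each von Mises(mu, kappa) component contributes
    (I_1(kappa) / I_0(kappa)) * e^{i*mu}, so the mixture moment is the
    weighted sum of per-component moments -- closed form, no sampling.
    """
    return sum(w * (bessel_i(1, k) / bessel_i(0, k)) * cmath.exp(1j * mu)
               for w, mu, k in zip(weights, mus, kappas))

# Hypothetical two-component posterior approximation of one frequency.
moment = vm_mixture_circular_moment([0.7, 0.3], [0.5, 0.6], [20.0, 30.0])
freq_estimate = cmath.phase(moment)  # point estimate of the frequency (radians)
concentration = abs(moment)          # near 1 when the posterior is sharp
```

The magnitude of the moment directly quantifies the uncertainty of the frequency estimate, which is the quantity the abstract says VALSE exploits rather than discards.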
Data Assimilation with Gaussian Mixture Models using the Dynamically Orthogonal Field Equations. Part II. Applications
The properties and capabilities of the GMM-DO filter are assessed and exemplified by applications to two dynamical systems: (1) the Double Well Diffusion and (2) Sudden Expansion flows; both of which admit far-from-Gaussian statistics. The former test case, or twin experiment, validates the use of the EM algorithm and Bayesian Information Criterion with Gaussian Mixture Models in a filtering context; the latter further exemplifies its ability to efficiently handle state vectors of non-trivial dimensionality and dynamics with jets and eddies. For each test case, qualitative and quantitative comparisons are made with contemporary filters. The sensitivity to input parameters is illustrated and discussed. Properties of the filter are examined and its estimates are described, including: the equation-based and adaptive prediction of the probability densities; the evolution of the mean field, stochastic subspace modes and stochastic coefficients; the fitting of Gaussian Mixture Models; and the efficient and analytical Bayesian updates at assimilation times and the corresponding data impacts. The advantages of respecting nonlinear dynamics and preserving non-Gaussian statistics are brought to light. For realistic test cases admitting complex distributions and with sparse or noisy measurements, the GMM-DO filter is shown to fundamentally improve the filtering skill, outperforming simpler schemes invoking the Gaussian parametric distribution.
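The EM-plus-BIC combination the abstract validates can be illustrated in miniature: fit Gaussian mixtures of increasing size by EM and keep the model with the lowest BIC. A self-contained 1D sketch (deliberately simplified; the GMM-DO filter applies this idea in a dynamically evolving stochastic subspace, not to raw scalar data):

```python
import math
import random

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def em_gmm_1d(data, k, iters=100):
    """Fit a k-component 1D Gaussian mixture by EM; return the final log-likelihood."""
    n = len(data)
    srt = sorted(data)
    mus = [srt[int((j + 0.5) * n / k)] for j in range(k)]  # quantile initialization
    sigmas = [(srt[-1] - srt[0]) / (2 * k) or 1.0] * k
    weights = [1.0 / k] * k
    loglik = 0.0
    for _ in range(iters):
        loglik, resp = 0.0, []
        for x in data:  # E-step: responsibilities of each component for each point
            probs = [w * normal_pdf(x, m, s) for w, m, s in zip(weights, mus, sigmas)]
            total = sum(probs)
            loglik += math.log(total)
            resp.append([p / total for p in probs])
        for j in range(k):  # M-step: re-estimate weights, means, variances
            nj = sum(r[j] for r in resp)
            weights[j] = nj / n
            mus[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var = sum(r[j] * (x - mus[j]) ** 2 for r, x in zip(resp, data)) / nj
            sigmas[j] = math.sqrt(max(var, 1e-6))
    return loglik

def bic(loglik, n_params, n_samples):
    """Bayesian Information Criterion: lower is better."""
    return n_params * math.log(n_samples) - 2.0 * loglik

# Bimodal synthetic data: BIC should prefer two components over one.
rng = random.Random(1)
data = [rng.gauss(-3.0, 0.5) for _ in range(200)] + [rng.gauss(3.0, 0.5) for _ in range(200)]
scores = {k: bic(em_gmm_1d(data, k), 3 * k - 1, len(data)) for k in (1, 2)}
best_k = min(scores, key=scores.get)
```

Here a k-component 1D mixture has 3k - 1 free parameters (k means, k standard deviations, k - 1 independent weights), which is what the BIC penalty counts.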
A Bayesian fusion model for space-time reconstruction of finely resolved velocities in turbulent flows from low resolution measurements
The study of turbulent flows calls for measurements with high resolution both
in space and in time. We propose a new approach to reconstruct
High-Temporal-High-Spatial resolution velocity fields by combining two sources
of information that are well-resolved either in space or in time, the
Low-Temporal-High-Spatial (LTHS) and the High-Temporal-Low-Spatial (HTLS)
resolution measurements. In the framework of co-design between sensing and
data post-processing, this work extensively investigates a Bayesian
reconstruction approach using a simulated database. A Bayesian fusion model is
developed to solve the inverse problem of data reconstruction. The model uses a
Maximum A Posteriori estimate, which yields the most probable field knowing the
measurements. The DNS of a wall-bounded turbulent flow at moderate Reynolds
number is used to validate and assess the performance of the present approach.
Low resolution measurements are subsampled in time and space from the fully
resolved data. Reconstructed velocities are compared to the reference DNS to
estimate the reconstruction errors. The model is compared to other conventional
methods such as Linear Stochastic Estimation and cubic spline interpolation.
Results show the superior accuracy of the proposed method in all
configurations. Further investigation of model performance over various ranges
of scales demonstrates its robustness. Numerical experiments also allow
estimating the expected maximum information level corresponding to the
limitations of experimental instruments.
Comment: 15 pages, 6 figures
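For a linear-Gaussian model, the Maximum A Posteriori estimate mentioned above has a closed form: the posterior precision is the sum of the prior and measurement precisions, and the MAP estimate is their precision-weighted average. A scalar sketch of that principle (an illustration of MAP fusion under Gaussian assumptions, not the paper's actual LTHS/HTLS model; all numbers below are made up):

```python
def gaussian_map_fuse(prior_mean, prior_var, measurements, noise_vars):
    """MAP estimate of a scalar under a Gaussian prior and independent
    Gaussian measurements: the precision-weighted average of prior and data.

    Returns (map_estimate, posterior_variance).
    """
    precision = 1.0 / prior_var + sum(1.0 / v for v in noise_vars)
    weighted = (prior_mean / prior_var
                + sum(y / v for y, v in zip(measurements, noise_vars)))
    return weighted / precision, 1.0 / precision

# Fuse a coarse (high-noise) and a fine (low-noise) observation of one velocity
# component, under a weak zero-mean prior.
map_est, post_var = gaussian_map_fuse(0.0, 10.0, [1.2, 0.8], [4.0, 0.25])
```

Note that the posterior variance is smaller than the noise variance of even the best single measurement, which is the basic reason fusing the two data sources improves on either alone.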
Bayesian Learning of Coupled Biogeochemical-Physical Models
Predictive dynamical models for marine ecosystems are used for a variety of
needs. Due to sparse measurements and limited understanding of the myriad of
ocean processes, there is however significant uncertainty. There is model
uncertainty in the parameter values, functional forms with diverse
parameterizations, level of complexity needed, and thus in the state fields. We
develop a Bayesian model learning methodology that allows interpolation in the
space of candidate models and discovery of new models from noisy, sparse, and
indirect observations, all while estimating state fields and parameter values,
as well as the joint PDFs of all learned quantities. We address the challenges
of high-dimensional and multidisciplinary dynamics governed by PDEs by using
state augmentation and the computationally efficient GMM-DO filter. Our
innovations include stochastic formulation and complexity parameters to unify
candidate models into a single general model as well as stochastic expansion
parameters within piecewise function approximations to generate dense candidate
model spaces. These innovations allow handling many compatible and embedded
candidate models, possibly none of which are accurate, and learning elusive
unknown functional forms. Our new methodology is generalizable, interpretable,
and extrapolates out of the space of models to discover new ones. We perform a
series of twin experiments based on flows past a ridge coupled with
three-to-five component ecosystem models, including flows with chaotic
advection. The probabilities of known, uncertain, and unknown model
formulations, and of state fields and parameters, are updated jointly using
Bayes' law. Non-Gaussian statistics, ambiguity, and biases are captured. The
parameter values and model formulations that best explain the data are
identified. When observations are sufficiently informative, model complexity
and functions are discovered.
Comment: 45 pages; 18 figures; 2 tables
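The joint update of model-formulation probabilities by Bayes' law reduces, at its core, to weighting each candidate's prior probability by the likelihood of the observed data under that candidate and renormalizing. A minimal sketch of that step (the paper performs this jointly with state and parameter estimation via the GMM-DO filter; the likelihood values below are hypothetical):

```python
def update_model_probabilities(priors, likelihoods):
    """Bayes' law over a discrete set of candidate models:
    P(M_j | data) is proportional to P(data | M_j) * P(M_j)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

# Three candidate ecosystem-model formulations, initially equally likely.
priors = [1 / 3, 1 / 3, 1 / 3]
likelihoods = [0.02, 0.10, 0.01]  # hypothetical data likelihoods under each model
posteriors = update_model_probabilities(priors, likelihoods)
```

Repeating this update as observations arrive concentrates probability on the formulations that best explain the data, which is how the abstract's "parameter values and model formulations that best explain the data are identified".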