Sequential Monte Carlo for fractional Stochastic Volatility Models
In this paper we consider a fractional stochastic volatility model, that is, a
model in which the volatility may exhibit long-range dependence or
rough/antipersistent behavior. We propose a dynamic sequential Monte Carlo
methodology that is applicable to both long memory and antipersistent processes
in order to estimate the volatility as well as the unknown parameters of the
model. We establish a central limit theorem for the state and parameter filters
and we study asymptotic properties (consistency and asymptotic normality) for
the filter. We illustrate our results with a simulation study and we apply our
method to estimating the volatility and the parameters of a long-range
dependent model for S&P 500 data.
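The sequential Monte Carlo idea described above can be illustrated with a minimal bootstrap particle filter for a plain Markovian stochastic volatility model (not the paper's fractional one); all parameter values below are illustrative assumptions, not values from the paper:

```python
import math
import random

random.seed(0)

# Toy Markovian SV model: x_t = mu + phi*(x_{t-1} - mu) + sigma*eta_t,
# y_t = exp(x_t / 2) * eps_t. Parameters are assumed for illustration.
mu, phi, sigma = -1.0, 0.95, 0.3
T, N = 100, 500  # time steps, particles

# Simulate synthetic observations.
x = mu
ys = []
for _ in range(T):
    x = mu + phi * (x - mu) + sigma * random.gauss(0, 1)
    ys.append(math.exp(x / 2) * random.gauss(0, 1))

# Bootstrap particle filter: propagate through the state equation,
# weight by the observation density y_t ~ N(0, exp(x_t)), resample.
particles = [random.gauss(mu, sigma) for _ in range(N)]
est = []  # filtered mean of the log-volatility
for y in ys:
    particles = [mu + phi * (p - mu) + sigma * random.gauss(0, 1)
                 for p in particles]
    # Log-weight of N(0, e^p) at y, constants dropped.
    logw = [-0.5 * (p + y * y * math.exp(-p)) for p in particles]
    m = max(logw)
    w = [math.exp(lw - m) for lw in logw]
    s = sum(w)
    w = [wi / s for wi in w]
    est.append(sum(wi * pi for wi, pi in zip(w, particles)))
    particles = random.choices(particles, weights=w, k=N)  # multinomial resampling

print(round(est[-1], 3))
```

Parameter estimation, as in the paper, would additionally carry parameter values in each particle; the sketch above only filters the latent log-volatility.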
State Space LSTM Models with Particle MCMC Inference
Long Short-Term Memory (LSTM) is one of the most powerful sequence models.
Despite its strong performance, however, it lacks the interpretability of
state space models. In this paper, we present a way to combine the best of
both worlds by introducing State Space LSTM (SSL) models that generalize the
earlier work \cite{zaheer2017latent} on combining topic models with LSTM.
However, unlike \cite{zaheer2017latent}, we do not make any factorization
assumptions in our inference algorithm. We present an efficient sampler based
on the sequential Monte Carlo (SMC) method that draws from the joint posterior
directly. Experimental results confirm the superiority and stability of this
SMC inference algorithm on a variety of domains.
Deep Recurrent Neural Network for Multi-target Filtering
This paper addresses the limitations of fixed motion and measurement models in
multi-target filtering by using an adaptive learning framework. This is performed
by defining target tuples with random finite set terminology and utilisation of
recurrent neural networks with a long short-term memory architecture. A novel
data association algorithm compatible with the predicted tracklet tuples is
proposed, enabling the update of occluded targets, in addition to assigning
birth, survival and death of targets. The algorithm is evaluated over a
commonly used filtering simulation scenario, with highly promising results.
Comment: The 25th International Conference on MultiMedia Modeling (MMM
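The data association step mentioned above is specific to the paper's tracklet tuples; as a generic illustration of the assignment problem it solves, here is a simple greedy nearest-neighbour association with gating. This is a baseline sketch, not the paper's algorithm, and the gate radius and coordinates are made up:

```python
import math

def greedy_associate(tracks, detections, gate=2.0):
    """Greedy nearest-neighbour data association with gating.

    Each predicted track position is matched to its closest unused
    detection within the gate radius; unmatched detections become
    birth candidates and unmatched tracks are flagged as occluded.
    """
    cands = []
    for d, z in enumerate(detections):
        for t, x in enumerate(tracks):
            dist = math.dist(x, z)
            if dist <= gate:
                cands.append((dist, t, d))
    cands.sort()  # cheapest pairings first
    pairs, used_t, used_d = [], set(), set()
    for dist, t, d in cands:
        if t not in used_t and d not in used_d:
            pairs.append((t, d))
            used_t.add(t)
            used_d.add(d)
    births = [d for d in range(len(detections)) if d not in used_d]
    occluded = [t for t in range(len(tracks)) if t not in used_t]
    return pairs, births, occluded

tracks = [(0.0, 0.0), (5.0, 5.0)]   # predicted positions
dets = [(0.3, -0.2), (9.0, 9.0)]    # measurements
pairs, births, occluded = greedy_associate(tracks, dets)
print(pairs, births, occluded)  # [(0, 0)] [1] [1]
```

Here detection 0 matches track 0, detection 1 is too far from every track and is treated as a birth, and track 1 has no detection and is treated as occluded.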
Long-Term Occupancy Grid Prediction Using Recurrent Neural Networks
We tackle the long-term prediction of scene evolution in a complex downtown
scenario for automated driving based on Lidar grid fusion and recurrent neural
networks (RNNs). A bird's eye view of the scene, including occupancy and
velocity, is fed as a sequence to an RNN which is trained to predict future
occupancy. The nature of the prediction task allows generation of multiple
hours of training data without the need for manual labeling. Thus, the
training strategy and loss function are designed for long sequences of
real-world data
(unbalanced, continuously changing situations, false labels, etc.). The deep
CNN architecture comprises convolutional long short-term memories (ConvLSTMs)
to separate static from dynamic regions and to predict dynamic objects in
future frames. Novel recurrent skip connections show the ability to predict
small occluded objects, e.g. pedestrians, and occluded static regions.
Spatio-temporal correlations between grid cells are exploited to predict
multimodal future paths and interactions between objects. Experiments also
quantify improvements to our previous network, a Monte Carlo approach, and
literature.
Comment: 8 pages, 10 figures
Big Learning with Bayesian Methods
Explosive growth in data and availability of cheap computing resources have
sparked increasing interest in Big learning, an emerging subfield that studies
scalable machine learning algorithms, systems, and applications with Big Data.
Bayesian methods represent one important class of statistical methods for machine
learning, with substantial recent developments on adaptive, flexible and
scalable Bayesian learning. This article provides a survey of the recent
advances in Big learning with Bayesian methods, termed Big Bayesian Learning,
including nonparametric Bayesian methods for adaptively inferring model
complexity, regularized Bayesian inference for improving the flexibility via
posterior regularization, and scalable algorithms and systems based on
stochastic subsampling and distributed computing for dealing with large-scale
applications.
Comment: 21 pages, 6 figures
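Stochastic subsampling, one of the scalability routes surveyed above, can be sketched with stochastic gradient Langevin dynamics (SGLD) on a toy posterior for a Gaussian mean. The model, step size, and batch size are illustrative assumptions:

```python
import math
import random

random.seed(1)

# Data: x_i ~ N(2, 1); prior theta ~ N(0, 10). SGLD uses an unbiased
# minibatch estimate of the log-posterior gradient plus injected noise.
data = [random.gauss(2.0, 1.0) for _ in range(10_000)]
N, batch = len(data), 100
eps = 1e-4  # step size (assumed; fixed here, usually decayed)

theta, samples = 0.0, []
for step in range(2000):
    mb = random.sample(data, batch)
    # grad log p(theta | data) ~= -theta/10 + (N/|B|) * sum_{i in B} (x_i - theta)
    grad = -theta / 10.0 + (N / batch) * sum(x - theta for x in mb)
    theta += 0.5 * eps * grad + math.sqrt(eps) * random.gauss(0, 1)
    if step >= 1000:  # discard burn-in
        samples.append(theta)

post_mean = sum(samples) / len(samples)
print(round(post_mean, 2))  # close to the data mean, ~2.0
```

Each update touches only 100 of the 10,000 points, which is the whole appeal of subsampling-based Bayesian inference at scale.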
Filtering Point Targets via Online Learning of Motion Models
Filtering point targets in highly cluttered and noisy data frames can be very
challenging, especially for complex target motions. Fixed motion models can
fail to provide accurate predictions, while learning-based algorithms can be
difficult to design (due to the variable number of targets), slow to train and
dependent on separate train/test steps. To address these issues, this paper
proposes a multi-target filtering algorithm which learns the motion models, on
the fly, using a recurrent neural network with a long short-term memory
architecture, as a regression block. The target state predictions are then
corrected using a novel data association algorithm, with a low computational
complexity. The proposed algorithm is evaluated over synthetic and real point
target filtering scenarios, demonstrating a remarkable performance over highly
cluttered data sequences.
Comment: arXiv admin note: text overlap with arXiv:1806.0659
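A lightweight stand-in for learning a motion model "on the fly" is recursive least squares (RLS) on a scalar linear model x_{t+1} = a*x_t + b; the paper uses an LSTM regression block instead, so everything below (model, forgetting factor, noise level) is an illustrative simplification:

```python
import random

random.seed(2)

# True motion model to be recovered online: x_{t+1} = 0.9 x_t + 1.0 + noise.
a_true, b_true = 0.9, 1.0
lam = 0.99                      # forgetting factor (assumed)
P = [[1e3, 0.0], [0.0, 1e3]]    # parameter covariance (large = uninformative)
w = [0.0, 0.0]                  # running estimates of [a, b]

x = 0.0
for _ in range(500):
    x_next = a_true * x + b_true + 0.05 * random.gauss(0, 1)
    phi = [x, 1.0]  # regressor for [a, b]
    # Gain k = P phi / (lam + phi' P phi); P is symmetric so phi'P = (P phi)'.
    Pphi = [P[0][0] * phi[0] + P[0][1] * phi[1],
            P[1][0] * phi[0] + P[1][1] * phi[1]]
    denom = lam + phi[0] * Pphi[0] + phi[1] * Pphi[1]
    k = [Pphi[0] / denom, Pphi[1] / denom]
    err = x_next - (w[0] * phi[0] + w[1] * phi[1])  # one-step prediction error
    w = [w[0] + k[0] * err, w[1] + k[1] * err]
    # Covariance downdate: P <- (P - k phi' P) / lam.
    P = [[(P[i][j] - k[i] * Pphi[j]) / lam for j in range(2)]
         for i in range(2)]
    x = x_next

print(round(w[0], 2), round(w[1], 2))  # estimates approach (0.9, 1.0)
```

The forgetting factor plays the role of adaptivity: older transitions are discounted, so the model tracks a target whose motion changes over time.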
Deep learning algorithm for data-driven simulation of noisy dynamical system
We present a deep learning model, DE-LSTM, for the simulation of a stochastic
process with an underlying nonlinear dynamics. The deep learning model aims to
approximate the probability density function of a stochastic process via
numerical discretization and the underlying nonlinear dynamics is modeled by
the Long Short-Term Memory (LSTM) network. It is shown that, when the numerical
discretization is used, the function estimation problem can be recast as a
multi-label classification problem. A penalized maximum log-likelihood method
is proposed to impose a smoothness condition in the prediction of the
probability distribution. We show that the time evolution of the probability
distribution can be computed by a high-dimensional integration of the
transition probability of the LSTM internal states. A Monte Carlo algorithm to
approximate the high-dimensional integration is outlined. The behavior of
DE-LSTM is thoroughly investigated by using the Ornstein-Uhlenbeck process and
noisy observations of nonlinear dynamical systems: the Mackey-Glass time
series and the forced Van der Pol oscillator. It is shown that DE-LSTM makes
a good prediction
of the probability distribution without assuming any distributional properties
of the stochastic process. For a multiple-step forecast of the Mackey-Glass
time series, the prediction uncertainty, denoted by the 95\% confidence
interval, first grows, then dynamically adjusts following the evolution of the
system, while in the simulation of the forced Van der Pol oscillator, the
prediction uncertainty does not grow in time even for a 3,000-step forecast.
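The discretization idea behind DE-LSTM, forecasting a binned probability distribution rather than a point value, can be sketched by Monte Carlo propagation of an Ornstein-Uhlenbeck process through fixed bins. Here the known OU dynamics stand in for the LSTM transition model, and all numbers are assumptions:

```python
import math
import random

random.seed(3)

# OU process dx = -theta * x * dt + sigma * dW, simulated by Euler steps.
theta, sigma, dt = 1.0, 0.5, 0.1

def bin_of(x):
    # 20 equal bins on [-2, 2]; out-of-range mass is clamped to edge bins.
    i = int((x + 2.0) / 0.2)
    return min(max(i, 0), 19)

# Start all probability mass at x = 1.0 and evolve 30 steps.
samples = [1.0] * 5000
for _ in range(30):
    samples = [x - theta * x * dt + sigma * math.sqrt(dt) * random.gauss(0, 1)
               for x in samples]

# Binned approximation of the forecast probability distribution.
hist = [0] * 20
for x in samples:
    hist[bin_of(x)] += 1
probs = [h / len(samples) for h in hist]
mean = sum(samples) / len(samples)
print(round(mean, 2), round(sum(probs), 2))
```

The binned `probs` vector is the kind of object DE-LSTM predicts directly: a full distribution at each forecast step, from which confidence intervals like the 95% band above can be read off without distributional assumptions.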
Uncertainty assessment of hydrologic model states and parameters: Sequential data assimilation using the particle filter
Two elementary issues in contemporary Earth system science and engineering are (1) the specification of model parameter values which characterize a system and (2) the estimation of state variables which express the system dynamics. This paper explores a novel sequential hydrologic data assimilation approach for estimating model parameters and state variables using particle filters (PFs). PFs have their origin in Bayesian estimation. Methods for batch calibration, despite major recent advances, appear to lack the flexibility required to treat uncertainties in the current system as new information is received. Methods based on sequential Bayesian estimation seem better able to take advantage of the temporal organization and structure of information, so that better compliance of the model output with observations can be achieved. Such methods provide platforms for improved uncertainty assessment and estimation of hydrologic model components, by providing more complete and accurate representations of the forecast and analysis probability distributions. This paper introduces particle filtering as a sequential Bayesian filtering method whose features represent the full probability distribution of predictive uncertainties. Particle filters have, so far, generally been used to recursively estimate the posterior distribution of the model state; this paper investigates their applicability to the approximation of the posterior distribution of parameters. The capability and usefulness of particle filters for adaptive inference of the joint posterior distribution of the parameters and state variables are illustrated via two case studies using a parsimonious conceptual hydrologic model. Copyright 2005 by the American Geophysical Union.
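Joint state-parameter inference with a particle filter, as investigated above, amounts to augmenting each particle with a parameter value. A minimal sketch on a toy linear-Gaussian model (not the paper's hydrologic model), with a small parameter jitter as regularization against sample impoverishment:

```python
import math
import random

random.seed(4)

# Toy model: x_t = a * x_{t-1} + q*eta, y_t = x_t + r*eps; estimate a jointly.
a_true, q, r = 0.8, 0.5, 0.5
T, N = 200, 1000

# Simulate observations.
x = 0.0
ys = []
for _ in range(T):
    x = a_true * x + q * random.gauss(0, 1)
    ys.append(x + r * random.gauss(0, 1))

# Each particle carries (state, parameter); parameters start uninformative.
parts = [(random.gauss(0, 1), random.uniform(0, 1)) for _ in range(N)]
for y in ys:
    # Propagate states with each particle's own parameter; jitter parameters.
    parts = [(a * s + q * random.gauss(0, 1), a + 0.01 * random.gauss(0, 1))
             for s, a in parts]
    w = [math.exp(-0.5 * ((y - s) / r) ** 2) for s, _ in parts]
    tot = sum(w)
    w = [wi / tot for wi in w]
    parts = random.choices(parts, weights=w, k=N)  # resample jointly

a_est = sum(a for _, a in parts) / N
print(round(a_est, 2))  # concentrates near a_true = 0.8
```

Without the jitter, resampling would collapse the parameter dimension onto a few initial draws; the kernel perturbation keeps the parameter posterior explorable, at the cost of some artificial diffusion.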
Ensemble transform algorithms for nonlinear smoothing problems
Several numerical tools designed to overcome the challenges of smoothing in a
nonlinear and non-Gaussian setting are investigated for a class of particle
smoothers. The considered family of smoothers is induced by the class of linear
ensemble transform filters which contains classical filters such as the
stochastic ensemble Kalman filter, the ensemble square root filter and the
recently introduced nonlinear ensemble transform filter. Further, the ensemble
transform particle smoother is introduced and particularly highlighted as it is
consistent in the particle limit and does not require assumptions with respect
to the family of the posterior distribution. The linear update pattern of the
considered class of linear ensemble transform smoothers allows one to implement
important supplementary techniques such as adaptive spread corrections, hybrid
formulations, and localization in order to facilitate their application to
complex estimation problems. These additional features are derived and
numerically investigated for a sequence of increasingly challenging test
problems.
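The stochastic ensemble Kalman filter named above is the simplest member of the linear ensemble transform family; a scalar-state, scalar-observation analysis step can be sketched as follows (ensemble size and all numbers are illustrative):

```python
import random

random.seed(5)

# Forecast ensemble for a scalar state (mean ~0, spread ~2).
ens = [random.gauss(0.0, 2.0) for _ in range(200)]
y_obs, r = 3.0, 1.0  # observation and its error variance

# Ensemble statistics and the resulting Kalman gain.
m = sum(ens) / len(ens)
var = sum((e - m) ** 2 for e in ens) / (len(ens) - 1)
gain = var / (var + r)

# Stochastic EnKF analysis: each member is nudged toward its own
# perturbed copy of the observation, which keeps the spread consistent.
analysis = [e + gain * (y_obs + r ** 0.5 * random.gauss(0, 1) - e)
            for e in ens]

m_a = sum(analysis) / len(analysis)
print(round(m_a, 1))  # pulled from ~0 toward the observation 3.0
```

Because the update is linear in the ensemble members, add-ons like localization or spread corrections, as discussed above, can be expressed as modifications of the same transform.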
The Wigner branching random walk: Efficient implementation and performance evaluation
To implement the Wigner branching random walk, a particle carrying a signed
weight, either $+1$ or $-1$, is more friendly to data storage and arithmetic
manipulations than one taking a real-valued weight continuously from $-1$ to
$1$. The former is called a signed particle and the latter a weighted
particle. In this paper, we propose two efficient strategies to realize the
signed-particle implementation. One is to interpret the multiplicative
functional as the probability to generate pairs of particles instead of the
incremental weight, and the other is to utilize a bootstrap filter to adjust
the skewness of particle weights. Performance evaluations on the Gaussian
barrier scattering (2D) and a Helium-like system (4D) demonstrate the
feasibility of both strategies and the variance reduction property of the
second approach. We provide an improvement of the first signed-particle
implementation that partially alleviates the restriction on the time step and
perform a thorough theoretical and numerical comparison among all the existing
signed-particle implementations. Details on implementing the importance
sampling according to the quasi-probability density and an efficient resampling
or particle reduction are also provided.
Comment: Submitted for publication on Sep. 6, 201
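The signed-particle bookkeeping discussed above can be sketched as a resampling step that draws particle positions proportionally to |w| and retains only the sign of each weight; positions and weights below are synthetic, not from the Wigner setting:

```python
import random

random.seed(6)

# Weighted particles: (position, real-valued weight).
particles = [(random.uniform(-1, 1), random.uniform(-0.2, 1.0))
             for _ in range(1000)]

mass = sum(abs(w) for _, w in particles)      # total absolute mass
probs = [abs(w) / mass for _, w in particles]
N = 1000
resampled = random.choices(particles, weights=probs, k=N)
# Convert to signed particles: keep only sign(w) per draw.
signed = [(x, 1 if w >= 0 else -1) for x, w in resampled]

# The signed ensemble preserves the signed total weight in expectation,
# once rescaled by mass / N:
before = sum(w for _, w in particles)
after = mass / N * sum(s for _, s in signed)
print(round(before, 1), round(after, 1))
```

Storing a single sign bit per particle instead of a continuous weight is exactly the storage and arithmetic advantage the abstract attributes to the signed-particle implementation.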