Fast change point analysis on the Hurst index of piecewise fractional Brownian motion
In this presentation, we introduce a new method for change point analysis on
the Hurst index for a piecewise fractional Brownian motion. We first set the
model and the statistical problem. The proposed method is a transposition of
the FDpV (Filtered Derivative with p-value) method introduced for the detection
of change points on the mean in Bertrand et al. (2011) to the case of changes
on the Hurst index. The statistic underlying the FDpV technique is a new
estimator of the Hurst index, the so-called Increment Bernoulli Statistic
(IBS). Both FDpV and IBS have time and memory complexity that is linear in the
length of the series, so the resulting method for change point analysis on the
Hurst index also achieves linear complexity.
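As a rough illustration of the filtered-derivative idea (not the paper's IBS estimator — the local Hurst estimator below is a standard two-scale quadratic-variation ratio, and all names and parameters are illustrative), one can estimate the Hurst index on sliding windows and then flag points where the averages of the estimate to the left and to the right differ markedly:

```python
import numpy as np

def local_hurst(x, window):
    """Sliding-window Hurst estimates from the ratio of quadratic variations
    at lags 2 and 1, using E[(X_{t+2}-X_t)^2] / E[(X_{t+1}-X_t)^2] = 2^(2H)."""
    h = np.empty(len(x) - window)
    for t in range(len(h)):
        seg = x[t:t + window]
        v1 = np.mean(np.diff(seg) ** 2)
        v2 = np.mean((seg[2:] - seg[:-2]) ** 2)
        h[t] = 0.5 * np.log2(v2 / v1)
    return h

def filtered_derivative_changes(h, A, threshold):
    """Flag indices where the mean of h over the A points on the right and
    the A points on the left differ by more than the threshold."""
    kernel = np.r_[np.ones(A), -np.ones(A)] / A
    d = np.abs(np.convolve(h, kernel, mode="valid"))
    return np.flatnonzero(d > threshold)

# Sanity check on ordinary Brownian motion (H = 0.5): local estimates hover
# near 0.5 and no change points are flagged at a conservative threshold.
rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(4000))
h = local_hurst(x, 200)
changes = filtered_derivative_changes(h, A=200, threshold=0.5)
```

Both passes over the series are single sweeps, which is consistent with the linear time complexity claimed for the method.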
A brief history of long memory: Hurst, Mandelbrot and the road to ARFIMA
Long memory plays an important role in many fields by determining the
behaviour and predictability of systems; for instance, climate, hydrology,
finance, networks and DNA sequencing. In particular, it is important to test
whether a process exhibits long memory, since that impacts the accuracy and
confidence with which one may predict future events on the basis of a small
amount of historical data. A major force in the development and study of long
memory was the late Benoit B. Mandelbrot. Here we discuss the original
motivation of the development of long memory and Mandelbrot's influence on this
fascinating field. We will also elucidate the sometimes contrasting approaches
to long memory in different scientific communities.
Comment: 40 pages
Self-similar prior and wavelet bases for hidden incompressible turbulent motion
This work is concerned with the ill-posed inverse problem of estimating
turbulent flows from the observation of an image sequence. From a Bayesian
perspective, a divergence-free isotropic fractional Brownian motion (fBm) is
chosen as a prior model for instantaneous turbulent velocity fields. This
self-similar prior characterizes accurately second-order statistics of velocity
fields in incompressible isotropic turbulence. Nevertheless, the associated
maximum a posteriori estimate involves a fractional Laplacian operator which is delicate
to implement in practice. To deal with this issue, we propose to decompose the
divergence-free fBm on well-chosen wavelet bases. As a first alternative, we
propose to design wavelets as whitening filters. We show that these filters are
fractional Laplacian wavelets composed with the Leray projector. As a second
alternative, we use a divergence-free wavelet basis, which implicitly takes
into account the incompressibility constraint arising from physics. Although
the latter decomposition involves correlated wavelet coefficients, we are able
to handle this dependence in practice. Based on these two wavelet
decompositions, we finally provide effective and efficient algorithms to
approach the maximum a posteriori. An intensive numerical evaluation proves the
relevance of the proposed wavelet-based self-similar priors.
Comment: SIAM Journal on Imaging Sciences, 201
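The fractional Laplacian that makes the maximum a posteriori delicate in physical space is diagonal in Fourier space, which is what makes whitening-type filters natural. A minimal 1-D periodic sketch of that operator (illustrative only — not the paper's 2-D divergence-free wavelet construction or Leray projector):

```python
import numpy as np

def fractional_laplacian_1d(f, alpha, length=1.0):
    """Apply (-Laplacian)^alpha to a periodic 1-D signal: the operator is
    diagonal in Fourier space with symbol |xi|^(2*alpha)."""
    n = len(f)
    xi = 2.0 * np.pi * np.fft.fftfreq(n, d=length / n)  # angular frequencies
    return np.fft.ifft(np.abs(xi) ** (2.0 * alpha) * np.fft.fft(f)).real

# On a pure Fourier mode the operator just rescales by |xi|^(2*alpha):
# (-Laplacian)^(1/2) of sin(6*pi*t) equals 6*pi*sin(6*pi*t).
t = np.linspace(0.0, 1.0, 256, endpoint=False)
f = np.sin(6.0 * np.pi * t)
g = fractional_laplacian_1d(f, alpha=0.5)
```

Because the symbol is a plain multiplier, applying any power of the Laplacian costs one forward and one inverse FFT, whereas the same operator is nonlocal and awkward to discretize in physical space.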
Bayesian Inference for partially observed SDEs Driven by Fractional Brownian Motion
We consider continuous-time diffusion models driven by fractional Brownian
motion. Observations are assumed to possess a non-trivial likelihood given the
latent path. Due to the non-Markovianity and high-dimensionality of the latent
paths, estimating posterior expectations is a computationally challenging
undertaking. We present a reparameterization framework based on the Davies and
Harte method for sampling stationary Gaussian processes and use this framework
to construct a Markov chain Monte Carlo algorithm that allows computationally
efficient Bayesian inference. The Markov chain Monte Carlo algorithm is based
on a version of hybrid Monte Carlo that delivers increased efficiency when
applied to the high-dimensional latent variables arising in this context. We
specify the methodology on a stochastic volatility model allowing for memory in
the volatility increments through a fractional specification. The methodology
is illustrated on simulated data and on the S&P500/VIX time series and is shown
to be effective. Contrary to the long-range dependence (Hurst parameter larger
than 1/2) often assumed for such models in the literature, the posterior
distribution favours values smaller than 1/2, pointing towards medium-range
dependence.
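The Davies and Harte method mentioned above is circulant embedding for stationary Gaussian processes: embed the autocovariance in a circulant matrix, diagonalize it with the FFT, and sample exactly. A minimal sketch for fractional Gaussian noise (illustrative — the normalization and the guard against negative eigenvalues are implementation choices, not the paper's code):

```python
import numpy as np

def fgn_davies_harte(n, hurst, rng):
    """Sample n points of fractional Gaussian noise by circulant embedding
    of the fGn autocovariance, diagonalized via the FFT."""
    k = np.arange(n + 1, dtype=float)
    # fGn autocovariance: gamma(k) = 0.5(|k+1|^2H - 2|k|^2H + |k-1|^2H)
    gamma = 0.5 * ((k + 1) ** (2 * hurst) - 2 * k ** (2 * hurst)
                   + np.abs(k - 1) ** (2 * hurst))
    row = np.concatenate([gamma, gamma[-2:0:-1]])  # circulant row, length 2n
    lam = np.fft.fft(row).real                     # eigenvalues of the embedding
    if lam.min() < -1e-8 * lam.max():
        raise ValueError("negative eigenvalues: embedding failed for this n, H")
    m = 2 * n
    eps = rng.standard_normal(m) + 1j * rng.standard_normal(m)
    y = np.fft.fft(np.sqrt(np.clip(lam, 0.0, None) / m) * eps)
    return y.real[:n]  # real part has exactly the target covariance

rng = np.random.default_rng(0)
noise = fgn_davies_harte(1024, hurst=0.7, rng=rng)
path = np.cumsum(noise)  # a discretized fractional Brownian motion, H = 0.7
```

The O(n log n) cost of the two FFTs is what makes this sampler attractive as the basis of a reparameterization for the high-dimensional latent paths.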