
    Simulation Methodology for Inference on Physical Parameters of Complex Vector-Valued Signals

    Complex-valued vector time series occur in diverse fields such as oceanography and meteorology, and scientifically interpretable parameters may be estimated from them. We show that it is possible to make inference, such as confidence intervals, on these parameters using a vector-valued circulant embedding simulation method combined with bootstrapping. We apply the methodology to three parameters of interest in oceanography, and compare the resulting simulated confidence intervals with those computed using analytic results. We conclude that the simulation scheme offers an inference approach either in the absence of theoretical distributional results, or to check the effect of nuisance parameters where theoretical results are available.
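    The circulant-embedding-plus-bootstrap idea can be sketched in a scalar, real-valued setting (the paper treats vector-valued complex series; the autocovariance sequence, sample sizes, and target parameter below are illustrative assumptions, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(42)

def circulant_embedding_sample(acf, rng):
    """Draw one stationary Gaussian series whose autocovariance at lags
    0..n-1 is `acf`, via circulant embedding (Dietrich-Newsam style)."""
    n = len(acf)
    c = np.concatenate([acf, acf[-2:0:-1]])   # embed in a circulant, m = 2n - 2
    m = len(c)
    lam = np.fft.fft(c).real                  # eigenvalues of the circulant
    if lam.min() < -1e-8 * lam.max():
        raise ValueError("embedding is not nonnegative definite")
    lam = np.clip(lam, 0.0, None)             # clear round-off negatives
    eps = rng.standard_normal(m) + 1j * rng.standard_normal(m)
    y = np.fft.fft(np.sqrt(lam) * eps) / np.sqrt(m)
    return y.real[:n]                         # y.imag[:n] is an independent copy

# Simulation-based percentile interval for an estimator: here the lag-1
# autocorrelation of an exponentially decaying autocovariance (true value 0.6).
acf = 0.6 ** np.arange(128)
est = []
for _ in range(500):
    x = circulant_embedding_sample(acf, rng)
    est.append(np.corrcoef(x[:-1], x[1:])[0, 1])
lo, hi = np.percentile(est, [2.5, 97.5])      # simulated 95% interval
```

    Each FFT-based draw costs O(n log n), which is what makes repeating the simulation hundreds of times for interval construction cheap.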

    Matrix products for the synthesis of stationary time series with a priori prescribed joint distributions

    Inspired by non-equilibrium statistical physics models, a general framework enabling the definition and synthesis of stationary time series with a priori prescribed and controlled joint distributions is constructed. Its central feature consists of preserving for the joint distribution the simple product structure it has under independence, while enabling controlled and prescribed dependencies amongst samples to be input. To that end, it is based on products of d-dimensional matrices whose entries consist of valid distributions. The statistical properties of the time series thus defined are studied in detail. Recasting this framework as a Hidden Markov Model yields an efficient synthesis procedure. Well-chosen pedagogical examples (time series with the same marginal distribution and the same covariance function, but different joint distributions) illustrate the power and potential of the approach and show how targeted statistical properties can actually be prescribed. Comment: 4 pages, 2 figures; conference publication in IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 201
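    A minimal sketch of the Hidden Markov Model view, assuming a hypothetical 2-state chain with Gaussian emissions (the transition matrix, emission parameters, and state count below are illustrative choices, not the paper's construction): the transition matrix tunes the dependence between successive samples while the stationary distribution pins down the marginal mixture.

```python
import numpy as np

rng = np.random.default_rng(7)

# Row-stochastic transition matrix of the hidden 2-state chain.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Stationary distribution pi solves pi @ P = pi (eigenvector of P.T for 1).
w, v = np.linalg.eig(P.T)
pi = np.abs(v[:, np.argmin(np.abs(w - 1))].real)
pi /= pi.sum()

def synthesize(n):
    """Stationary series: hidden Markov states + state-conditional emissions."""
    s = np.empty(n, dtype=int)
    s[0] = rng.choice(2, p=pi)                 # start in stationarity
    for t in range(1, n):
        s[t] = rng.choice(2, p=P[s[t-1]])
    means = np.array([-1.0, 1.0])              # state-conditional emission means
    return s, means[s] + 0.3 * rng.standard_normal(n)

states, series = synthesize(512)
```

    Changing the off-diagonal entries of `P` alters the joint (temporal) dependence of the series without touching its marginal mixture, which is the kind of separation the abstract describes.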

    Hybrid simulation scheme for volatility modulated moving average fields

    We develop a simulation scheme for a class of spatial stochastic processes called volatility modulated moving averages. A characteristic feature of this model is that the behaviour of the moving average kernel at zero governs the roughness of realisations, whereas its behaviour away from zero determines the global properties of the process, such as long range dependence. Our simulation scheme takes this into account and approximates the moving average kernel by a power function around zero and by a step function elsewhere. For this type of approach, the authors of [8], who considered an analogous model in one dimension, coined the expression hybrid simulation scheme. We derive the asymptotic mean square error of the simulation scheme, compare it with several other simulation techniques in a simulation study, and exemplify its favourable performance.
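    In one dimension the hybrid idea (exact treatment of the power-law kernel near zero, a step rule elsewhere) can be loosely sketched as follows. This is a simplified illustration: the singular contribution is drawn independently of the driving increments, unlike in the full scheme of [8], and `kernel_tail`, the exponent, and the grid are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def hybrid_simulate(n, dt, alpha, kernel_tail, rng):
    """Sketch of X_t = int_0^t g(t-s) dW(s) on a grid, where g(x) ~ x**alpha
    near zero and `kernel_tail` gives g away from zero (alpha > -1/2)."""
    dW = rng.normal(scale=np.sqrt(dt), size=n)      # Brownian increments
    # Exact variance of the singular lag over one step:
    # Var int_0^dt s**alpha dW(s) = dt**(2*alpha + 1) / (2*alpha + 1)
    sing = rng.normal(scale=np.sqrt(dt**(2*alpha + 1) / (2*alpha + 1)), size=n)
    g = kernel_tail(np.arange(1, n) * dt)           # step-rule kernel values
    X = np.empty(n)
    for i in range(n):
        X[i] = sing[i] + np.dot(g[:i], dW[:i][::-1])
    return X

# Gamma-type kernel away from zero; alpha < 0 produces rough realisations.
X = hybrid_simulate(400, 1 / 100, alpha=-0.2,
                    kernel_tail=lambda x: x**-0.2 * np.exp(-x), rng=rng)
```

    The split mirrors the abstract: the `sing` term captures the kernel's behaviour at zero (path roughness), while `kernel_tail` controls the global dependence structure through the step part of the sum.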

    Stochastic interpolation of sparsely sampled time series by a superstatistical random process and its synthesis in Fourier and wavelet space

    We present a novel method for stochastic interpolation of sparsely sampled time signals based on a superstatistical random process generated from a multivariate Gaussian scale mixture. In comparison to other stochastic interpolation methods such as Gaussian process regression, our method possesses strong multifractal properties and is thus applicable to a broad range of real-world time series, e.g. from solar wind or atmospheric turbulence. Furthermore, we provide a sampling algorithm in terms of a mixing procedure that consists of generating a (1+1)-dimensional field u(t, ξ), where each Gaussian component u_ξ(t) is synthesized with identical underlying noise but a different covariance function C_ξ(t, s) parameterized by a log-normally distributed parameter ξ. Due to the Gaussianity of each component u_ξ(t), we can exploit standard sampling algorithms such as Fourier or wavelet methods and, most importantly, methods to constrain the process on the sparse measurement points. The scale mixture u(t) is then initialized by assigning each point in time t a ξ(t), and therefore a specific value from u(t, ξ), where the time-dependent parameter ξ(t) follows a log-normal process with a correlation time scale that is large compared to the correlation time of u(t, ξ). We juxtapose Fourier and wavelet methods and show that a multiwavelet-based hierarchical approximation of the interpolating paths, which produces a sparse covariance structure, provides an adequate method to locally interpolate large and sparse datasets. Comment: 25 pages, 14 figures
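    A loose sketch of the mixing procedure, assuming an Ornstein-Uhlenbeck covariance C_ξ(t, s) = exp(-|t-s|/ξ) for each component and a crude random-walk surrogate for the slowly varying ξ(t) (both illustrative choices, not the paper's): every component reuses the same underlying noise, and the path u(t) picks one component per time point.

```python
import numpy as np

rng = np.random.default_rng(5)

n, n_xi = 128, 8
t = np.arange(n, dtype=float)

# Identical underlying noise z for every Gaussian component u_xi(t).
z = rng.standard_normal(n)
xis = np.sort(np.exp(rng.normal(0.0, 0.4, size=n_xi)))   # log-normal parameters

# Each component: zero-mean Gaussian with covariance exp(-|t-s| / xi).
field = np.empty((n_xi, n))
for j, xi in enumerate(xis):
    C = np.exp(-np.abs(t[:, None] - t[None, :]) / xi)
    L = np.linalg.cholesky(C + 1e-9 * np.eye(n))         # jitter for stability
    field[j] = L @ z                                     # shared z: components co-move

# Slowly varying xi(t): a smoothed walk mapped onto the component index range.
walk = np.cumsum(rng.normal(0.0, 1.0, size=n))
idx = np.clip(np.round((walk - walk.min()) / (np.ptp(walk) + 1e-12)
                       * (n_xi - 1)).astype(int), 0, n_xi - 1)
u = field[idx, np.arange(n)]                             # the scale-mixture path
```

    Because each row of `field` is Gaussian, the Cholesky factor could be replaced by any standard conditional (constrained) Gaussian sampler to pin the components to sparse measurements, which is the step the abstract highlights.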