
    A Monte-Carlo Approach to Zero Energy Quantum Scattering

    Get PDF
    Monte-Carlo methods for zero energy quantum scattering are developed. Starting from path integral representations for scattering observables, we present results of numerical calculations for potential scattering and scattering off a schematic $^4$He nucleus. The convergence properties of Monte-Carlo algorithms for scattering systems are analyzed using stochastic differential equations as a path sampling method. Comment: 30 pages, LaTeX, 8 (uuencoded, tarred and gzipped) postscript figures
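
    The abstract does not spell out the sampling scheme, so the following is only a rough, generic sketch of what "stochastic differential equations as a path sampling method" can look like: Euler-Maruyama paths of a toy SDE are drawn and a path functional is averaged by Monte Carlo. The potential, step size, and observable below are placeholders, not the paper's actual scattering setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_paths(n_paths=2000, n_steps=500, dt=0.01, x0=0.0):
    """Euler-Maruyama sampling of paths of dX = -V'(X) dt + dW.

    The drift -V'(x), with a toy Gaussian bump V(x) = exp(-x**2), is a
    placeholder; an actual scattering problem would define its own dynamics.
    """
    def dVdx(x):
        return -2.0 * x * np.exp(-x**2)

    x = np.full(n_paths, x0)
    paths = np.empty((n_steps + 1, n_paths))
    paths[0] = x
    for k in range(n_steps):
        noise = rng.normal(scale=np.sqrt(dt), size=n_paths)
        x = x - dVdx(x) * dt + noise
        paths[k + 1] = x
    return paths

dt = 0.01
paths = sample_paths(dt=dt)
# Monte-Carlo average of a path functional (a toy Feynman-Kac-style weight).
weights = np.exp(-dt * np.sum(np.exp(-paths[:-1] ** 2), axis=0))
estimate = weights.mean()
err = weights.std(ddof=1) / np.sqrt(weights.size)
print(f"path-integral estimate: {estimate:.4f} +/- {err:.4f}")
```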

    Bootstrap Methods for Heavy-Tail or Autocorrelated Distributions with an Empirical Application

    Get PDF
    Chapter One: The Truncated Wild Bootstrap for the Asymmetric Infinite Variance Case. The wild bootstrap method proposed by Cavaliere et al. (2013) to perform hypothesis testing for the location parameter in the location model, with errors in the domain of attraction of an asymmetric stable law, is inappropriate. Hence, we introduce a new bootstrap test procedure that overcomes the failure of Efron's (1979) resampling bootstrap. This test exploits the wild bootstrap of Cavaliere et al. (2013) and the central limit theorem for trimmed variables of Berkes et al. (2012) to deliver confidence sets with correct asymptotic coverage probabilities for asymmetric heavy-tailed data. The methodology entails locating two cut-off values such that all data between them satisfy the central limit theorem conditions; since it takes advantage of both findings, the proposed procedure is termed the Truncated Wild Bootstrap (TWB). Simulation evidence on the quality of inference of the available bootstrap tests for this model reveals that, on most occasions, the TWB performs better than the parametric bootstrap (PB) of Cornea-Madeira & Davidson (2015). In addition, the TWB test scheme is superior to the PB because it can test the location parameter when the index of stability is below one, whereas the PB has no power in that case. The TWB is also superior to the PB when the tail index is close to 1 and the distribution is heavily skewed, unless the tail index is exactly 1 and the scale parameter is very high.

    Chapter Two: A Frequency Domain Wild Bootstrap for Dependent Data. In this chapter a resampling method is proposed for stationary dependent time series, based on Rademacher wild bootstrap draws from the Fourier transform of the data. The main distinguishing feature of the method is that the bootstrap draws share their periodogram identically with the sample, implying sound properties under dependence of arbitrary form. A drawback of the basic procedure is that the bootstrap distribution of the mean is degenerate; we show that a simple Gaussian augmentation overcomes this difficulty. Monte Carlo evidence indicates a favourable comparison with alternative methods in tests of location, of significance in a regression model with autocorrelated shocks, and of unit roots.

    Chapter Three: Frequency-Based Bootstrap Methods for DC Pension Plan Strategy Evaluation. Using conventional bootstrap methods, such as the standard bootstrap and the moving block bootstrap, to produce long-run returns and rank one strategy over the others on the basis of its associated reward and risk can be misleading. Therefore, in this chapter we use a simple pension model, mainly concerned with long-term wealth accumulation, to assess different bootstrap methods for the first time in the pension literature. We find that the Multivariate Fourier Bootstrap gives the most satisfactory result in its ability to mimic the true distribution, as measured by the Cramér-von Mises statistic. We also address the disagreement in the pension literature on selecting the best pension plan strategy, presenting a comprehensive study that compares different strategies using different bootstrap procedures and different cash-flow performance (CFP) measures across a range of countries. We find that the bootstrap method plays a critical role in determining the optimal strategy, and that different CFP measures rank pension plans differently across countries and bootstrap methods.
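
    The frequency-domain wild bootstrap of Chapter Two lends itself to a compact illustration. The sketch below multiplies each Fourier coefficient of the demeaned series by an independent Rademacher sign; because |±1| = 1, every draw shares its periodogram with the sample. It is a schematic of the idea only, not the thesis procedure: the zero-frequency term is simply held fixed and the Gaussian augmentation for the mean is omitted, and the AR(1) example data and the median statistic are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(42)

def fourier_wild_bootstrap(x, n_boot=999):
    """Rademacher wild bootstrap in the frequency domain.

    Each replicate shares its periodogram with the original series, since
    multiplying a Fourier coefficient by +/-1 does not change its modulus.
    Sketch only: the zero-frequency (mean) term is kept fixed and no
    Gaussian augmentation is applied.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    coeffs = np.fft.rfft(x - x.mean())          # work with the demeaned series
    draws = np.empty((n_boot, n))
    for b in range(n_boot):
        signs = rng.choice([-1.0, 1.0], size=coeffs.size)
        signs[0] = 1.0                          # leave the DC term alone
        draws[b] = np.fft.irfft(coeffs * signs, n=n) + x.mean()
    return draws

# Example: bootstrap band for the median of a stationary AR(1) series.
n = 200
e = rng.normal(size=n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + e[t]

boot = fourier_wild_bootstrap(x, n_boot=499)
stats = [np.median(s) for s in boot]
print("sample median:", np.median(x))
print("bootstrap 95% band:", np.percentile(stats, [2.5, 97.5]))
```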

    Bayesian reconstruction of the cosmological large-scale structure: methodology, inverse algorithms and numerical optimization

    Full text link
    We address the inverse problem of cosmic large-scale structure reconstruction from a Bayesian perspective. For a linear data model, a number of known and novel reconstruction schemes, which differ in the underlying signal prior, data likelihood, and numerical inverse extra-regularization scheme, are derived and classified. The Bayesian methodology presented in this paper aims to unify and extend the following methods: Wiener filtering, Tikhonov regularization, ridge regression, maximum entropy, and inverse regularization techniques. The inverse techniques considered here are asymptotic regularization, the Jacobi, steepest descent, Newton-Raphson, and Landweber-Fridman schemes, and both linear and non-linear Krylov methods based on Fletcher-Reeves, Polak-Ribière, and Hestenes-Stiefel conjugate gradients. The structures of the best-performing algorithms to date are presented in an operator formulation that allows the power of fast Fourier transforms to be exploited. Using such an implementation of the generalized Wiener filter in the novel ARGO software package, the different numerical schemes are benchmarked on 1-, 2-, and 3-dimensional problems including structured white and Poissonian noise, data windowing, and blurring effects. A novel numerical Krylov scheme is shown to be superior in terms of performance and fidelity. These fast inverse methods will ultimately enable the application of sampling techniques to explore complex joint posterior distributions. We outline how the joint space of the dark-matter density field, the peculiar velocity field, and the power spectrum can be explored by a Gibbs-sampling process. Such a method can be applied to correct for the redshift distortions of the observed galaxies and for time-reversal reconstructions of the initial density field. Comment: 40 pages, 11 figures
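
    As a toy illustration of the operator-based approach the abstract describes, the sketch below applies a generalized Wiener filter to a 1-D signal by solving (S^-1 + N^-1) s = N^-1 d with a conjugate-gradient (Krylov) solver, implementing the signal-prior term through fast Fourier transforms rather than explicit matrices. The assumed power spectrum, noise level, and synthetic data are placeholders; this is a minimal sketch of the idea, not the ARGO implementation.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(1)
n = 256
t = np.arange(n)
signal = np.sin(2 * np.pi * t / 64) + 0.5 * np.sin(2 * np.pi * t / 16)  # toy signal
sigma_n = 0.5                                   # assumed white-noise standard deviation
data = signal + sigma_n * rng.normal(size=n)    # linear data model d = s + n

# Assumed signal power spectrum P(k) on the rfft grid (placeholder prior).
k = np.fft.rfftfreq(n)
power = 1.0 / (1e-3 + (2 * np.pi * k) ** 2)

def apply_A(x):
    """Apply the Wiener-filter operator S^-1 + N^-1, using FFTs for S^-1."""
    prior_term = np.fft.irfft(np.fft.rfft(x) / power, n=n)
    return prior_term + x / sigma_n ** 2

A = LinearOperator((n, n), matvec=apply_A, dtype=float)
b = data / sigma_n ** 2                         # right-hand side N^-1 d
s_wiener, info = cg(A, b)                       # Krylov (conjugate-gradient) solve
print("CG exit flag:", info,
      "| reconstruction rms:", np.sqrt(np.mean((s_wiener - signal) ** 2)))
```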

    Techniques and errors in measuring cross-correlation and cross-spectral density functions

    Get PDF
    Techniques and errors in measuring cross-spectral density and cross-correlation functions of stationary dynamic pressure data.
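
    As a minimal illustration of the quantities being measured, the snippet below estimates the cross-correlation and the cross-spectral density of two synthetic stationary signals with SciPy. The signals, sampling rate, and Welch segment length are arbitrary choices for the demo, not the report's measurement setup.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(7)
fs = 1000.0                                    # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
common = rng.normal(size=t.size)               # shared stationary component
x = common + 0.3 * rng.normal(size=t.size)
y = np.roll(common, 25) + 0.3 * rng.normal(size=t.size)   # delayed copy plus noise

# Cross-correlation estimate (biased, approximately normalized).
lags = signal.correlation_lags(x.size, y.size)
rxy = signal.correlate(x - x.mean(), y - y.mean()) / (x.std() * y.std() * x.size)
print("peak correlation at lag:", lags[np.argmax(rxy)])

# Cross-spectral density via Welch averaging; segment length trades bias vs variance.
f, Pxy = signal.csd(x, y, fs=fs, nperseg=1024)
print("CSD magnitude near 100 Hz:", np.abs(Pxy[np.argmin(np.abs(f - 100.0))]))
```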

    Linear Reconstruction of Non-Stationary Image Ensembles Incorporating Blur and Noise Models

    Get PDF
    Two new linear reconstruction techniques are developed to improve the resolution of images collected by ground-based telescopes imaging through atmospheric turbulence. The classical approach involves the application of constrained least squares (CLS) to the deconvolution from wavefront sensing (DWFS) technique. The new algorithm incorporates blur and noise models to select the appropriate regularization constant automatically. In all cases examined, the Newton-Raphson minimization converged to a solution in less than 10 iterations. The non-iterative Bayesian approach involves the development of a new vector Wiener filter which is optimal with respect to mean square error (MSE) for a non-stationary object class degraded by atmospheric turbulence and measurement noise. This research involves the first extension of the Wiener filter to account properly for shot noise and an unknown, random optical transfer function (OTF). The vector Wiener filter provides superior reconstructions when compared to the traditional scalar Wiener filter for a non-stationary object class. In addition, the new filter can provide a superresolution capability when the object's Fourier domain statistics are known for spatial frequencies beyond the OTF cutoff. A generalized performance and robustness study of the vector Wiener filter showed that MSE performance is fundamentally limited by object signal-to-noise ratio (SNR) and correlation between object pixels.
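
    The vector Wiener filter itself is specific to this dissertation, but the baseline it extends, scalar Wiener deconvolution of a blurred, noisy image given an OTF and a noise-to-signal power ratio, can be sketched briefly. The Gaussian OTF, the constant spectral ratio, and the random test object below are illustrative assumptions only.

```python
import numpy as np

def wiener_deconvolve(image, otf, nsr=0.01):
    """Scalar Wiener filter in the Fourier domain.

    otf : optical transfer function sampled on the image's FFT grid.
    nsr : assumed noise-to-signal power ratio (constant here; in general
          it varies with spatial frequency).
    """
    G = np.fft.fft2(image)
    W = np.conj(otf) / (np.abs(otf) ** 2 + nsr)   # Wiener transfer function
    return np.real(np.fft.ifft2(W * G))

# Toy example: blur a random "object" with a Gaussian OTF, add noise, restore.
rng = np.random.default_rng(3)
n = 128
obj = rng.random((n, n))
fx = np.fft.fftfreq(n)
fy = np.fft.fftfreq(n)
FX, FY = np.meshgrid(fx, fy, indexing="ij")
otf = np.exp(-(FX ** 2 + FY ** 2) / (2 * 0.05 ** 2))   # illustrative Gaussian OTF
blurred = np.real(np.fft.ifft2(otf * np.fft.fft2(obj)))
noisy = blurred + 0.01 * rng.normal(size=obj.shape)
restored = wiener_deconvolve(noisy, otf, nsr=1e-3)
print("rms error, blurred vs object: ", np.sqrt(np.mean((blurred - obj) ** 2)))
print("rms error, restored vs object:", np.sqrt(np.mean((restored - obj) ** 2)))
```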

    Efficient tracking of the cross-correlation coefficient

    Full text link

    Data decoding aided channel estimation techniques for OFDM systems in vehicular environment

    Get PDF
    This thesis studies and develops channel tracking algorithms for systems based on Orthogonal Frequency Division Multiplexing (OFDM) modulation, with reference to the IEEE 802.11p standard for mobile Wireless Local Area Network (WLAN) communications, both vehicle-to-vehicle and vehicle-to-infrastructure. The main characteristic of wireless systems in the vehicular environment is the presence of the Doppler effect, caused by the relative velocity between transmitter and receiver, which makes the wireless channel time-varying.
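
    The thesis algorithms are not reproduced here, but the general idea of data-decoding-aided (decision-directed) channel tracking for OFDM can be sketched: once a symbol is decided, it is reused as a pilot, and the per-subcarrier least-squares estimate is smoothed over time to follow the Doppler-induced channel variation. The QPSK constellation, the forgetting factor, and the toy phase-drift channel model below are illustrative assumptions, not the schemes developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(11)
n_sc, n_sym = 64, 100                 # subcarriers and OFDM symbols (toy sizes)
alpha = 0.6                           # assumed forgetting factor of the tracker
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

# Slowly time-varying frequency-domain channel (toy model of Doppler variation).
h_true = np.ones(n_sc, dtype=complex)
h_est = np.ones(n_sc, dtype=complex)  # initial estimate, e.g. from a preamble
errors = []
for m in range(n_sym):
    h_true *= np.exp(1j * 0.02 * rng.normal(size=n_sc))      # random phase drift
    tx = qpsk[rng.integers(4, size=n_sc)]                     # transmitted symbols
    rx = h_true * tx + 0.05 * (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc))

    # Equalize with the current estimate, then make hard QPSK decisions.
    eq = rx / h_est
    decided = qpsk[np.argmin(np.abs(eq[:, None] - qpsk[None, :]), axis=1)]

    # Decision-directed LS estimate, smoothed with a first-order tracker.
    h_ls = rx / decided
    h_est = alpha * h_est + (1 - alpha) * h_ls
    errors.append(np.mean(np.abs(h_est - h_true) ** 2))

print("mean channel-estimation MSE over the burst:", np.mean(errors))
```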

    Computational Strategies in Lattice QCD

    Full text link
    Lectures given at the Summer School on "Modern perspectives in lattice QCD", Les Houches, August 3-28, 2009. Comment: LaTeX source, 72 pages, 23 figures; v2: misprints corrected, minor text changes

    Dimension Reduction for Time Series in a Blind Source Separation Context Using R

    Get PDF
    Multivariate time series observations are increasingly common in many fields of science, but the complex dependencies of such data often translate into intractable models with a large number of parameters. An alternative is to first reduce the dimension of the series and then model the resulting uncorrelated signals univariately, avoiding the need for any covariance parameters. A popular and effective framework for this is blind source separation. In this paper we review the dimension reduction tools for time series available in the R package tsBSS. These include methods for estimating the signal dimension of second-order stationary time series, dimension reduction techniques for stochastic volatility models, and supervised dimension reduction tools for time series regression. Several examples are provided to illustrate the functionality of the package.
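
    tsBSS is an R package, so a direct example would be written in R; as a language-neutral sketch of the second-order blind source separation principle it builds on, the Python code below implements a classical AMUSE-style estimator: whiten the series with its covariance, then eigendecompose a symmetrized lagged autocovariance to recover uncorrelated latent signals. This illustrates the principle only and is not the tsBSS interface; the lag, mixing matrix, and AR(1) sources are arbitrary choices.

```python
import numpy as np

def amuse(X, lag=1):
    """AMUSE-style second-order blind source separation.

    X   : array of shape (T, p), a p-variate time series.
    lag : autocovariance lag used for the rotation (assumes one lag suffices).
    Returns the unmixing matrix W and the recovered sources (T, p).
    """
    Xc = X - X.mean(axis=0)
    T = Xc.shape[0]
    # 1. Whitening via the zero-lag covariance matrix.
    cov = Xc.T @ Xc / T
    vals, vecs = np.linalg.eigh(cov)
    white = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = Xc @ white.T
    # 2. Symmetrized lagged autocovariance of the whitened series.
    R = Z[:-lag].T @ Z[lag:] / (T - lag)
    R = (R + R.T) / 2
    # 3. Its eigenvectors give the rotation back to uncorrelated sources.
    _, U = np.linalg.eigh(R)
    W = U.T @ white
    return W, Xc @ W.T

# Toy example: mix two AR(1) signals with different autocorrelation, then unmix.
rng = np.random.default_rng(5)
T = 2000
s1, s2 = np.zeros(T), np.zeros(T)
for t in range(1, T):
    s1[t] = 0.9 * s1[t - 1] + rng.normal()
    s2[t] = -0.5 * s2[t - 1] + rng.normal()
S = np.column_stack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])          # arbitrary mixing matrix
X = S @ A.T
W, S_hat = amuse(X)
corr = np.corrcoef(S_hat.T, S.T)[:2, 2:]        # recovered vs true sources
print("abs correlations, recovered vs true:\n", np.abs(np.round(corr, 2)))
```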