Multiplicative local linear hazard estimation and best one-sided cross-validation
This paper develops detailed mathematical and statistical theory for a new class of cross-validation techniques for local linear kernel hazard estimators and their multiplicative bias corrections. The new class combines principles of local information with recent advances in indirect cross-validation. A few applications of cross-validating multiplicative kernel hazard estimators do exist in the literature; however, this paper introduces the detailed theory and small-sample analysis, and extends both to our new class of best one-sided cross-validation. Best one-sided cross-validation turns out to perform excellently in its practical illustrations, in its small-sample behaviour, and in its theoretical properties.
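As a minimal illustration of the plain least-squares cross-validation principle that the one-sided and indirect variants above refine, here is a bandwidth selector for an ordinary Gaussian kernel density estimate. This is a hedged stand-in, not the paper's hazard estimator: the data model, kernel, and bandwidth grid are all our assumptions.

```python
import numpy as np

def gauss(u):
    # standard normal density
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def lscv(data, b):
    """Least-squares cross-validation score for a Gaussian-kernel density
    estimate: integral(fhat^2) minus twice the mean leave-one-out fit."""
    n = len(data)
    d = data[:, None] - data[None, :]
    # closed form for integral(fhat^2): convolving two N(0, b^2) Gaussian
    # kernels gives an N(0, 2 b^2) density
    int_f2 = gauss(d / (b * np.sqrt(2))).sum() / (n**2 * b * np.sqrt(2))
    K = gauss(d / b)
    np.fill_diagonal(K, 0.0)               # leave observation i out
    loo = K.sum(axis=1) / ((n - 1) * b)    # fhat_{-i}(X_i)
    return int_f2 - 2.0 * loo.mean()

rng = np.random.default_rng(0)
data = rng.normal(size=200)
bandwidths = np.linspace(0.05, 1.5, 60)
scores = np.array([lscv(data, b) for b in bandwidths])
b_star = bandwidths[int(np.argmin(scores))]
```

One-sided cross-validation replaces the symmetric kernel in the criterion with a one-sided one and rescales the selected bandwidth; the paper's contribution is the theory for doing this well in the hazard setting.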
Shortcomings of a parametric VaR approach and nonparametric improvements based on a non-stationary return series model
A non-stationary regression model for financial returns is examined theoretically in this paper. Volatility dynamics are modelled as exogenous and deterministic, captured by nonparametric curve estimation on equidistant centered returns. We prove consistency and asymptotic normality of a symmetric variance estimator and of a one-sided variance estimator, and derive remarks on the choice of bandwidth. Further attention is paid to asymmetry and heavy tails of the return distribution, implemented via an asymmetric version of the Pearson type VII distribution for the random innovations. By providing a method of moments for its parameter estimation and a connection to the Student-t distribution, we offer the framework for a factor-based VaR approach. The approximation quality of the non-stationary model is supported by simulation studies.
Keywords: heteroscedastic asset returns, non-stationarity, nonparametric regression, volatility, innovation modelling, asymmetric heavy tails, distributional forecast, Value at Risk (VaR)
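A minimal sketch of the exogenous, deterministic volatility idea: a Nadaraya-Watson fit of squared centered returns on rescaled time, followed by a quantile-based VaR read-off. Gaussian innovations and every parameter choice here are our assumptions (the paper uses an asymmetric Pearson type VII innovation law instead).

```python
import numpy as np

def nw_volatility(r, b):
    """Nadaraya-Watson estimate of a deterministic volatility curve:
    kernel regression of squared centered returns on rescaled time."""
    n = len(r)
    t = np.arange(n) / n
    u = (t[:, None] - t[None, :]) / b
    w = np.exp(-0.5 * u**2)                      # Gaussian kernel weights
    var = (w * r[None, :]**2).sum(axis=1) / w.sum(axis=1)
    return np.sqrt(var)

rng = np.random.default_rng(1)
n = 1000
time = np.arange(n) / n
sigma_true = 0.01 + 0.02 * time                  # slowly varying volatility
r = sigma_true * rng.standard_normal(n)
r = r - r.mean()                                 # equidistant centered returns
sig_hat = nw_volatility(r, b=0.1)
# factor-style VaR read-off: scale an empirical innovation quantile by the
# last fitted volatility (illustrative, not the paper's Pearson VII fit)
z = r / sig_hat
var_99 = -sig_hat[-1] * np.quantile(z, 0.01)     # 99% one-period VaR
```

The estimator above is the symmetric (two-sided) one; the one-sided variant the paper also analyses would weight only past observations, i.e. zero out `w` above the diagonal.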
The Local Fractional Bootstrap
We introduce a bootstrap procedure for high-frequency statistics of Brownian
semistationary processes. More specifically, we focus on a hypothesis test on
the roughness of sample paths of Brownian semistationary processes, which uses
an estimator based on a ratio of realized power variations. Our new resampling
method, the local fractional bootstrap, relies on simulating an auxiliary
fractional Brownian motion that mimics the fine properties of high frequency
differences of the Brownian semistationary process under the null hypothesis.
We prove the first order validity of the bootstrap method and in simulations we
observe that the bootstrap-based hypothesis test provides considerable
finite-sample improvements over an existing test that is based on a central
limit theorem. This is important when studying the roughness properties of time
series data; we illustrate this by applying the bootstrap method to two
empirical data sets: we assess the roughness of a time series of high-frequency
asset prices and we test the validity of Kolmogorov's scaling law in
atmospheric turbulence data.
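The ratio-of-realized-power-variations idea behind the test can be sketched directly. The toy below simulates fractional Brownian motion (the auxiliary process the bootstrap also draws on) and recovers its roughness from realized variations at two lags. The Cholesky-based simulation and all parameter choices are our assumptions, not the paper's bootstrap procedure.

```python
import numpy as np

def fbm(n, H, rng):
    """Fractional Brownian motion path via Cholesky factorization of the
    fractional Gaussian noise covariance (fine for small n)."""
    k = np.arange(n)
    gamma = 0.5 * (np.abs(k + 1)**(2 * H) + np.abs(k - 1)**(2 * H)
                   - 2 * np.abs(k)**(2 * H))
    cov = gamma[np.abs(k[:, None] - k[None, :])]
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))
    return np.concatenate([[0.0], np.cumsum(L @ rng.standard_normal(n))])

def cof_roughness(x):
    """Change-of-frequency roughness estimate: for fBm the ratio of the
    realized quadratic variation at lag 2 to lag 1 concentrates around
    2^(2H), so 0.5 * log2(ratio) estimates H."""
    rv1 = np.sum(np.diff(x)**2)
    rv2 = np.sum((x[2:] - x[:-2])**2)
    return 0.5 * np.log2(rv2 / rv1)

rng = np.random.default_rng(7)
H = 0.3                                          # a "rough" path, H < 1/2
est = np.mean([cof_roughness(fbm(500, H, rng)) for _ in range(20)])
```

The bootstrap in the paper resamples such auxiliary fBm paths to approximate the null distribution of this kind of statistic, rather than relying on its central limit theorem.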
Smoothed Particle Hydrodynamics in cosmology: a comparative study of implementations
We analyse the performance of twelve different implementations of Smoothed
Particle Hydrodynamics (SPH) using seven tests designed to isolate key
hydrodynamic elements of cosmological simulations which are known to cause the
SPH algorithm problems. In order, we consider a shock tube, spherical adiabatic
collapse, cooling flow model, drag, a cosmological simulation, rotating
cloud-collapse and disc stability. In the implementations special attention is
given to the way in which force symmetry is enforced in the equations of
motion. We study in detail how the hydrodynamics are affected by different
implementations of the artificial viscosity including those with a
shear-correction modification. We present an improved first-order
smoothing-length update algorithm that is designed to remove instabilities that
are present in the Hernquist and Katz (1989) algorithm.
For all tests we find that the artificial viscosity is the most important
factor distinguishing the results from the various implementations. The second
most important factor is the way force symmetry is achieved in the equation of
motion. Most results favour a kernel symmetrization approach. The exact method
by which SPH pressure forces are included has comparatively little effect on
the results. Combining the equation of motion presented in Thomas and Couchman
(1992) with a modification of the Monaghan and Gingold (1983) artificial
viscosity leads to an SPH scheme that is both fast and reliable.
Comment: 30 pages, 26 figures and 9 tables included. Submitted to MNRAS. Postscript version available at ftp://phobos.astro.uwo.ca/pub/etittley/papers/sphtest.ps.g
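The kernel symmetrization the results favour can be shown in its simplest setting: a symmetrized SPH density sum in 1D with the standard cubic spline kernel, where each pairwise interaction averages the kernels evaluated with both particles' smoothing lengths. The uniform particle setup is our assumption, purely for illustration.

```python
import numpy as np

def w_cubic(r, h):
    """Standard cubic spline (M4) SPH kernel in 1D, normalization 2/(3h)."""
    q = np.abs(r) / h
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                 np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return (2.0 / (3.0 * h)) * w

def sph_density(x, m, h):
    """Kernel-symmetrized density sum: W_ij = (W(h_i) + W(h_j)) / 2,
    keeping pairwise terms symmetric when smoothing lengths vary."""
    r = x[:, None] - x[None, :]
    w_sym = 0.5 * (w_cubic(r, h[:, None]) + w_cubic(r, h[None, :]))
    return (m[None, :] * w_sym).sum(axis=1)

# equal-mass particles on a uniform grid of total mass 1 on [0, 1]
x = np.linspace(0.0, 1.0, 101)
m = np.full_like(x, 1.0 / 101)
h = np.full_like(x, 2.0 * (x[1] - x[0]))   # smoothing length = 2 spacings
rho = sph_density(x, m, h)                  # ~1 away from the boundaries
```

With variable smoothing lengths this symmetrization ensures the pairwise force derived from the density is equal and opposite, which is the force-symmetry issue the abstract highlights.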
Hybrid scheme for Brownian semistationary processes
We introduce a simulation scheme for Brownian semistationary processes, which
is based on discretizing the stochastic integral representation of the process
in the time domain. We assume that the kernel function of the process is
regularly varying at zero. The novel feature of the scheme is to approximate
the kernel function by a power function near zero and by a step function
elsewhere. The resulting approximation of the process is a combination of
Wiener integrals of the power function and a Riemann sum, which is why we call
this method a hybrid scheme. Our main theoretical result describes the
asymptotics of the mean square error of the hybrid scheme and we observe that
the scheme leads to a substantial improvement of accuracy compared to the
ordinary forward Riemann-sum scheme, while having the same computational
complexity. We exemplify the use of the hybrid scheme by two numerical
experiments, where we examine the finite-sample properties of an estimator of
the roughness parameter of a Brownian semistationary process and study Monte
Carlo option pricing in the rough Bergomi model of Bayer et al. [Quant. Finance
16(6), 887-904, 2016], respectively.
Comment: 33 pages, 4 figures. v4: minor revision; in particular, we have derived a new expression (3.5), equivalent to the previous one but numerically more convenient, for the off-diagonal elements of the covariance matrix Sigma.
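A stripped-down sketch of the hybrid idea, under strong assumptions: the pure power kernel g(x) = x^alpha with kappa = 1, i.e. the Riemann-Liouville special case used in rough Bergomi, rather than a general regularly varying kernel. The most recent step is simulated as an exact Wiener integral of the power function, jointly Gaussian with the Brownian increment; older steps use step-function weights at the optimal evaluation points b_k.

```python
import numpy as np

def hybrid_rl(alpha, n, T, n_paths, rng):
    """Sketch of a kappa = 1 hybrid scheme for the Riemann-Liouville
    process X_t = int_0^t (t - s)^alpha dW_s (illustrative only)."""
    dt = T / n
    # exact joint law of (Delta W_i, V_i), where V_i is the Wiener integral
    # of the power kernel over the most recent step
    c12 = dt**(alpha + 1) / (alpha + 1)
    cov = np.array([[dt, c12],
                    [c12, dt**(2 * alpha + 1) / (2 * alpha + 1)]])
    L = np.linalg.cholesky(cov)
    Z = rng.standard_normal((n_paths, n, 2)) @ L.T
    dW, V = Z[..., 0], Z[..., 1]
    # optimal kernel evaluation points b_k for the step-function part
    k = np.arange(2, n + 1, dtype=float)
    b = ((k**(alpha + 1) - (k - 1)**(alpha + 1)) / (alpha + 1))**(1 / alpha)
    g = (b * dt)**alpha                 # step-function weights, lags >= 1
    X = np.empty((n_paths, n))
    for i in range(n):
        past = (g[:i] * dW[:, i - 1::-1]).sum(axis=1) if i > 0 else 0.0
        X[:, i] = V[:, i] + past        # hybrid: exact recent + Riemann past
    return X

rng = np.random.default_rng(3)
X = hybrid_rl(alpha=-0.3, n=100, T=1.0, n_paths=2000, rng=rng)
var_T = X[:, -1].var()                  # exact value: T^(2a+1)/(2a+1) = 2.5
```

The gain over the ordinary forward Riemann sum comes entirely from treating the singular part of the kernel near zero exactly, at the same O(n) cost per path.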
Practical Statistics for the LHC
This document is a pedagogical introduction to statistics for particle
physics. Emphasis is placed on the terminology, concepts, and methods being
used at the Large Hadron Collider. The document addresses both the statistical
tests applied to a model of the data and the modeling itself.
Comment: presented at the 2011 European School of High-Energy Physics, Cheile Gradistei, Romania, 7-20 September 2011. I expect to release updated versions of this document in the future.
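For a flavor of the methods such an introduction covers, here is one workhorse in its simplest form: the asymptotic discovery significance for a single-bin counting experiment with known background. This textbook special case (profile likelihood ratio with the Cowan et al. asymptotic formula) is our choice of illustration, not a summary of the document's full treatment.

```python
import math

def discovery_significance(n_obs, b):
    """Z = sqrt(q0), where q0 = -2 ln [ L(n | s = 0) / L(n | s_hat) ] for a
    Poisson count n with known background b and s_hat = n - b (asymptotic
    discovery test; illustrative special case)."""
    if n_obs <= b:
        return 0.0          # downward fluctuations carry no discovery power
    q0 = 2.0 * (n_obs * math.log(n_obs / b) - (n_obs - b))
    return math.sqrt(q0)    # significance in Gaussian sigma

Z = discovery_significance(n_obs=15, b=5.0)
p = 0.5 * math.erfc(Z / math.sqrt(2.0))   # one-sided Gaussian p-value
```

Observing 15 events over an expected background of 5 gives roughly a 3.6-sigma excess, below the conventional 5-sigma discovery threshold used at the LHC.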