Pareto Smoothed Importance Sampling
Importance weighting is a general way to adjust Monte Carlo integration to
account for draws from the wrong distribution, but the resulting estimate can
be noisy when the importance ratios have a heavy right tail. This routinely
occurs when there are aspects of the target distribution that are not well
captured by the approximating distribution, in which case more stable estimates
can be obtained by modifying extreme importance ratios. We present a new method
for stabilizing importance weights using a generalized Pareto distribution fit
to the upper tail of the distribution of the simulated importance ratios. The
method, which empirically performs better than existing methods for stabilizing
importance sampling estimates, includes stabilized effective sample size
estimates, Monte Carlo error estimates, and convergence diagnostics.
Comment: Major revision: 1) proofs for consistency, finite variance, and asymptotic normality; 2) justification of k < 0.7 with a theoretical computational complexity analysis; 3) major rewrite.
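As a minimal sketch of the idea described in the abstract (not the paper's algorithm: the tail fraction, the method-of-moments GPD fit, and all names here are simplifying assumptions, whereas the paper uses a more robust shape estimator), one can replace the largest importance ratios with quantiles of a generalized Pareto distribution fitted to the tail:

```python
import numpy as np

def psis_weights(log_ratios, tail_frac=0.2):
    """Hedged sketch of Pareto smoothing for importance ratios.

    Fits a generalized Pareto distribution (GPD) to the largest ratios by
    the method of moments (a simplification of the paper's estimator) and
    replaces them with quantiles of the fitted distribution.
    """
    r = np.exp(log_ratios - np.max(log_ratios))   # overflow-safe ratios
    n = r.size
    m = max(int(tail_frac * n), 5)                # tail size (assumption)
    order = np.argsort(r)
    tail = order[-m:]                             # indices of largest ratios
    u = r[order[-m - 1]]                          # threshold just below the tail
    exc = r[tail] - u                             # exceedances over the threshold
    mean, var = exc.mean(), exc.var()
    k = 0.5 * (1.0 - mean**2 / var)               # method-of-moments shape
    sigma = mean * (1.0 - k)                      # method-of-moments scale
    p = (np.arange(1, m + 1) - 0.5) / m           # plotting positions
    if abs(k) > 1e-8:
        q = sigma / k * ((1.0 - p) ** (-k) - 1.0)  # GPD quantile function
    else:
        q = -sigma * np.log(1.0 - p)               # exponential limit as k -> 0
    w = r.copy()
    w[tail] = np.minimum(u + q, r.max())          # smooth, truncate at raw max
    return w / w.sum(), k
```

A shape estimate k above roughly 0.7 signals that the importance ratios are too heavy-tailed for the estimate to be trusted, which is the diagnostic role the abstract alludes to.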
Order-statistics-based inferences for censored lifetime data and financial risk analysis
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. It focuses on applying order-statistics-based inference to lifetime analysis and financial risk measurement. The first problem arises from fitting the Weibull distribution to progressively censored and accelerated life-test data; a new order-statistics-based inference is proposed for both parameter and confidence interval estimation. The second problem adapts the inference used in the first problem to fitting the generalised Pareto distribution, especially when the sample size is small; with some modifications, the proposed inference is compared with classical methods and several relatively new methods from the recent literature. The third problem studies a distribution-free approach to forecasting financial volatility, which is essentially the standard deviation of financial returns. Classical models of this approach use the interval between two symmetric extreme quantiles of the return distribution as a proxy for volatility; two new models are proposed that use intervals of expected shortfalls and of expectiles instead of intervals of quantiles, and the different models are compared on empirical stock index data.
Finally, attention is drawn to heteroskedastic quantile regression. The proposed joint modelling approach, which exploits the parametric link between quantile regression and the asymmetric Laplace distribution, provides simultaneous estimates of the regression quantile and of the log-linear heteroskedastic scale. Furthermore, the use of the expectation of the check function as a measure of quantile deviation is discussed.
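The check function mentioned above is the standard quantile-regression loss. As a hedged illustration (the variable names and the grid search below are my own, not from the thesis), minimizing its sample mean recovers the corresponding empirical quantile:

```python
import numpy as np

def check_loss(u, tau):
    # Koenker-Bassett check function: rho_tau(u) = u * (tau - 1{u < 0})
    return u * (tau - (u < 0).astype(float))

# The tau-quantile of a sample minimizes the expected check loss.
rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
tau = 0.9
grid = np.linspace(-3.0, 3.0, 601)
losses = [check_loss(x - q, tau).mean() for q in grid]
q_hat = grid[int(np.argmin(losses))]
# q_hat should land near the empirical 90% quantile np.quantile(x, 0.9)
```

Replacing the constant q with a linear predictor turns this minimization into quantile regression, which is where the asymmetric Laplace link comes in: minimizing the check loss is equivalent to maximizing an asymmetric Laplace likelihood.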
Modeling time varying and multivariate environmental conditions for extreme load prediction on offshore structures in a reliability perspective
Ph.D. (Doctor of Philosophy)
Current Topics on Risk Analysis: ICRA6 and RISK2015 Conference
Papers presented at the International Conference on Risk Analysis ICRA 6 / RISK 2015, held in Barcelona, 26-29 May 2015. Peer-reviewed postprint (published version).