
    Robust estimation of risks from small samples

    Data-driven risk analysis involves the inference of probability distributions from measured or simulated data. In the case of a highly reliable system, such as the electricity grid, the amount of relevant data is often exceedingly limited, but the impact of estimation errors may be very large. This paper presents a robust nonparametric Bayesian method to infer possible underlying distributions. The method obtains rigorous error bounds even for small samples taken from ill-behaved distributions. The approach taken has a natural interpretation in terms of the intervals between ordered observations, where the allocation of probability mass across intervals is well specified, but the location of that mass within each interval is unconstrained. This formulation gives rise to a straightforward computational resampling method: Bayesian Interval Sampling. In a comparison with common alternative approaches, it is shown to satisfy strict error bounds even for ill-behaved distributions.
    Comment: 13 pages, 3 figures; supplementary information provided. A revised version of this manuscript has been accepted for publication in Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences.
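The interval construction described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes a uniform Dirichlet prior over the interval masses and a known bounded support, and the function name and defaults are hypothetical. Because the mass location within each interval is unconstrained, placing it at the left or right endpoints yields lower and upper bounds on the mean.

```python
import numpy as np

def interval_sampling_bounds(data, support=(0.0, 1.0), n_resamples=2000, seed=0):
    """Sketch of interval-based Bayesian resampling.
    Assumptions (illustrative, not from the paper): Dirichlet(1,...,1)
    prior over the n+1 interval masses, and a known bounded support."""
    rng = np.random.default_rng(seed)
    a, b = support
    # Ordered observations split the support into n+1 intervals.
    edges = np.concatenate(([a], np.sort(data), [b]))
    lower, upper = [], []
    for _ in range(n_resamples):
        # Allocate probability mass across the intervals.
        w = rng.dirichlet(np.ones(len(edges) - 1))
        # Mass location inside each interval is unconstrained:
        # left endpoints bound the mean from below, right endpoints from above.
        lower.append(np.dot(w, edges[:-1]))
        upper.append(np.dot(w, edges[1:]))
    return np.mean(lower), np.mean(upper)
```

Averaging the endpoint evaluations over resamples gives a bracket for the mean that remains valid however ill-behaved the within-interval distribution is.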

    A General Framework for Updating Belief Distributions

    We propose a framework for general Bayesian inference. We argue that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case when using the self-information loss. Modern application areas make it increasingly challenging for Bayesians to attempt to model the true data-generating mechanism. Moreover, when the object of interest is low dimensional, such as a mean or median, it is cumbersome to have to achieve this via a complete model for the whole data distribution. More importantly, there are settings where the parameter of interest does not directly index a family of density functions, and thus the Bayesian approach to learning about such parameters is currently regarded as problematic. Our proposed framework uses loss functions to connect information in the data to functionals of interest. The updating of beliefs then follows from a decision-theoretic approach involving cumulative loss functions. Importantly, the procedure coincides with Bayesian updating when a true likelihood is known, yet provides coherent subjective inference in much more general settings. Connections to other inference frameworks are highlighted.
    Comment: This is the pre-peer-reviewed version of the article "A General Framework for Updating Belief Distributions", which has been accepted for publication in the Journal of the Royal Statistical Society: Series B. This article may be used for non-commercial purposes in accordance with the Wiley Terms and Conditions for Self-Archiving.
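The loss-based update described in the abstract (posterior belief proportional to prior times the exponentiated negative cumulative loss) can be sketched on a parameter grid. The grid, the absolute loss for a median, and the unit learning rate are illustrative choices, not from the paper:

```python
import numpy as np

def gibbs_posterior(data, grid, loss=lambda theta, x: np.abs(x - theta), w=1.0):
    """Loss-based belief update on a parameter grid:
    posterior(theta) proportional to prior(theta) * exp(-w * sum_i loss(theta, x_i)).
    With the self-information (negative log-likelihood) loss this reduces
    to standard Bayesian updating."""
    prior = np.ones_like(grid) / len(grid)                  # flat prior on the grid
    cum_loss = np.array([loss(t, data).sum() for t in grid])
    post = prior * np.exp(-w * (cum_loss - cum_loss.min())) # shift for stability
    return post / post.sum()

# Illustrative run: learn a median (absolute loss) from a small sample,
# with no need to model the full data distribution.
data = np.array([1.0, 2.0, 2.5, 10.0])
grid = np.linspace(0.0, 12.0, 241)
post = gibbs_posterior(data, grid)
```

The resulting belief distribution concentrates between the two middle observations, where the cumulative absolute loss is minimized, without ever specifying a likelihood for the data.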

    Data and uncertainty in extreme risks - a nonlinear expectations approach

    Estimation of tail quantities, such as expected shortfall or Value at Risk, is a difficult problem. We show how the theory of nonlinear expectations, in particular the Data-robust expectation introduced in [5], can assist in the quantification of statistical uncertainty for these problems. However, when we are in a heavy-tailed context (in particular, when our data are described by a Pareto distribution, as is common in much of extreme value theory), the theory of [5] is insufficient and requires an additional regularization step, which we introduce. By asking whether this regularization is possible, we obtain a qualitative requirement for the reliable estimation of tail quantities and risk measures in a Pareto setting.
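A quick numerical illustration of why tail estimation in a Pareto setting is hard (this shows only the underlying difficulty, not the nonlinear-expectation machinery of the paper): the sampling variability of the empirical expected shortfall explodes as the tail index shrinks.

```python
import numpy as np

def empirical_es(x, alpha=0.99):
    """Empirical expected shortfall: mean of losses at or beyond the alpha-quantile."""
    q = np.quantile(x, alpha)
    return x[x >= q].mean()

rng = np.random.default_rng(1)

def pareto(tail_index, n):
    # Pareto on [1, inf) via inverse-CDF sampling: X = U**(-1/tail_index).
    return rng.uniform(size=n) ** (-1.0 / tail_index)

# Standard deviation of the ES estimate across repeated samples:
# a heavier tail (smaller index) makes the estimator far more variable.
spread = lambda a: np.std([empirical_es(pareto(a, 1000)) for _ in range(200)])
```

Comparing `spread(1.5)` with `spread(3.0)` shows the instability directly, which is the kind of situation where the extra regularization step introduced in the paper becomes necessary.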

    Estimating Financial Risk Measures for Futures Positions: A Non-Parametric Approach

    This paper presents non-parametric estimates of spectral risk measures applied to long and short positions in 5 prominent equity futures contracts. It also compares these to estimates of two popular alternative measures, the Value-at-Risk (VaR) and Expected Shortfall (ES). The spectral risk measures are conditioned on the coefficient of absolute risk aversion, and the latter two are conditioned on the confidence level. Our findings indicate that all risk measures increase dramatically and their estimators deteriorate in precision when their respective conditioning parameter increases. Results also suggest that estimates of spectral risk measures and their precision levels are of comparable orders of magnitude as those of more conventional risk measures.
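The three measures compared in the abstract can be computed nonparametrically from a loss sample as weighted averages of order statistics. The exponential weighting below is a common illustrative choice for a spectral risk measure (it is increasing and integrates to one, as coherence requires), with `k` playing the role of the coefficient of absolute risk aversion; the function name and defaults are assumptions, not from the paper.

```python
import numpy as np

def var_es_srm(losses, alpha=0.95, k=5.0):
    """Nonparametric VaR, ES, and an exponential spectral risk measure (SRM).
    Illustrative weighting: phi(p) = k * exp(-k * (1 - p)) / (1 - exp(-k)),
    which is nonnegative, increasing in p, and integrates to one on [0, 1]."""
    x = np.sort(losses)
    n = len(x)
    p = (np.arange(n) + 0.5) / n                    # plotting positions
    var = np.quantile(x, alpha)                     # Value-at-Risk
    es = x[x >= var].mean()                         # Expected Shortfall
    phi = k * np.exp(-k * (1.0 - p)) / (1.0 - np.exp(-k))
    srm = np.sum(phi * x) / n                       # Riemann sum of phi(p) * quantile(p)
    return var, es, srm
```

Raising `alpha` (for VaR/ES) or `k` (for the SRM) shifts weight toward the largest order statistics, which is exactly why the estimators lose precision as the conditioning parameter increases: ever fewer observations carry the estimate.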

    Regularizing Portfolio Optimization

    The optimization of large portfolios displays an inherent instability to estimation error. This poses a fundamental problem, because solutions that are not stable under sample fluctuations may look optimal for a given sample but are, in effect, very far from optimal with respect to the average risk. In this paper, we approach the problem from the point of view of statistical learning theory. The occurrence of the instability is intimately related to over-fitting, which can be avoided using known regularization methods. We show how regularized portfolio optimization with expected shortfall as the risk measure is related to support vector regression; the budget constraint dictates a modification. We present the resulting optimization problem and discuss its solution. The L2 norm of the weight vector is used as a regularizer, which corresponds to a diversification "pressure". This means that diversification, besides counteracting downward fluctuations in some assets with upward fluctuations in others, is also crucial because it improves the stability of the solution. The approach we provide here allows for the simultaneous treatment of optimization and diversification in one framework, enabling the investor to trade off between the two depending on the size of the available data set.
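The trade-off described above can be made concrete with a toy version of the objective: sample expected shortfall of the portfolio loss plus an L2 penalty on the weights, under the budget constraint. This sketch uses a two-asset grid search rather than the paper's support-vector-regression formulation, and the data and penalty values are illustrative assumptions.

```python
import numpy as np

def sample_es(losses, alpha=0.9):
    """Sample expected shortfall: mean loss at or beyond the alpha-quantile."""
    q = np.quantile(losses, alpha)
    return losses[losses >= q].mean()

def regularized_weights(returns, lam, alpha=0.9):
    """Two-asset sketch: minimize ES(-portfolio return) + lam * ||w||^2
    subject to w1 + w2 = 1 (budget constraint) and w >= 0, by grid search."""
    best_w, best_obj = None, np.inf
    for w1 in np.linspace(0.0, 1.0, 101):
        w = np.array([w1, 1.0 - w1])
        losses = -(returns @ w)
        obj = sample_es(losses, alpha) + lam * np.dot(w, w)
        if obj < best_obj:
            best_w, best_obj = w, obj
    return best_w

# Illustrative data: two assets with iid Gaussian returns.
rng = np.random.default_rng(7)
returns = rng.normal(0.01, 0.05, size=(500, 2))
# Increasing lam pushes the solution toward the diversified portfolio (0.5, 0.5):
# the L2 penalty is exactly the diversification "pressure" described above.
```

Because `||w||^2` is minimized on the budget simplex at equal weights, a larger `lam` stabilizes the solution against sample fluctuations at the cost of tracking the in-sample ES optimum less closely.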

    ISIPTA'07: Proceedings of the Fifth International Symposium on Imprecise Probability: Theories and Applications


    Extreme Measures of Agricultural Financial Risk

    This paper is from the Centre for Financial Markets (CFM) Working Paper series at University College Dublin.
    Keywords: Agricultural financial risk; Spectral risk measures; Expected Shortfall; Value at Risk; Extreme Value Theory; Agricultural Finance; Risk and Uncertainty. JEL codes: E17, G19, N52.

    There is a VaR beyond usual approximations

    Basel II and Solvency II both use Value-at-Risk (VaR) as the risk measure for computing capital requirements. In practice, to calibrate the VaR, a normal approximation is often chosen for the unknown distribution of the yearly log returns of financial assets. This is usually justified by appeal to the Central Limit Theorem (CLT), assuming aggregation of independent and identically distributed (iid) observations in the portfolio model. Such a modeling choice, in particular the use of light-tailed distributions, proved during the 2008/2009 crisis to be an inadequate approximation in the presence of extreme returns; as a consequence, it leads to a gross underestimation of the risks. The main objective of our study is to obtain the most accurate evaluations of the aggregated risk distribution and risk measures when working with financial or insurance data in the presence of heavy tails, and to provide practical solutions for accurately estimating high quantiles of aggregated risks. We explore a new method, called Normex, to handle this problem numerically as well as theoretically, based on properties of upper order statistics. Normex provides accurate results that are only weakly dependent on the sample size and the tail index. We compare it with existing methods.
    Comment: 33 pages, 5 figures.
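The underestimation the abstract warns about is easy to reproduce by simulation. This sketch is not Normex; it only illustrates the failure mode that motivates it, using an illustrative Pareto tail index and aggregation size. Even with a finite variance (so the CLT formally applies), the normal approximation sits well below the empirical high quantile of the aggregated losses.

```python
import numpy as np

rng = np.random.default_rng(42)

def pareto_sample(tail_index, size):
    """Pareto on [1, inf) via inverse-CDF sampling: X = U**(-1/tail_index)."""
    return rng.uniform(size=size) ** (-1.0 / tail_index)

# Yearly aggregate of n iid heavy-tailed losses. Tail index 2.5 gives a
# finite variance, so the CLT applies, but tail convergence is very slow.
n, trials = 52, 20000
sums = pareto_sample(2.5, (trials, n)).sum(axis=1)

true_q = np.quantile(sums, 0.995)             # empirical 99.5% quantile
normal_q = sums.mean() + 2.576 * sums.std()   # normal (CLT) approximation
# normal_q falls short of true_q: the light-tailed approximation
# grossly underestimates the aggregated risk at high confidence levels.
```

The gap widens as the confidence level rises or the tail index falls toward the infinite-variance regime, which is precisely the range where methods based on upper order statistics, such as Normex, are needed.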