342 research outputs found

    Expectations of linear functions with respect to truncated multinormal distributions, with applications for uncertainty analysis in environmental modelling

    Uncertainty can hamper the stringency of commitments under cap and trade schemes. We assess how well intensity targets, where countries' permit allocations are indexed to future realised GDP, can cope with uncertainties in a post-Kyoto international greenhouse emissions trading scheme. We present some empirical foundations for intensity targets and derive a simple rule for the optimal degree of indexation to GDP. Using an 18-region simulation model of a 2020 global cap-and-trade treaty under multiple uncertainties and endogenous commitments, we estimate that optimal intensity targets could achieve global abatement as much as 20 per cent higher than under absolute targets, and even greater increases in welfare measures. The optimal degree of indexation to GDP would vary greatly between countries, including super-indexation in some advanced countries, and partial indexation for most developing countries. Standard intensity targets (with one-to-one indexation) would also improve the overall outcome, but to a lesser degree and not in all cases. Although target indexation is no magic wand for a future global climate treaty, gains from reduced cost uncertainty might justify increased complexity, framing issues and other potential downsides of intensity targets.
    Keywords: linear functions, truncated multinormal distributions, uncertainty analysis, environmental modelling
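A toy sketch may help fix the indexation idea. The functional form, parameter names, and numbers below are assumptions for illustration only; they are not the paper's model or its optimal-indexation rule.

```python
# Illustrative only: scale a baseline permit allocation by realised GDP
# relative to its projection, raised to an indexation degree theta.
# theta = 0 is an absolute target, theta = 1 a standard ("one-to-one")
# intensity target, theta > 1 super-indexation, 0 < theta < 1 partial.
def indexed_allocation(baseline, gdp_realised, gdp_projected, theta):
    return baseline * (gdp_realised / gdp_projected) ** theta

# GDP turns out 10% above projection: how much does each target flex?
for theta in (0.0, 0.5, 1.0, 1.5):
    print(theta, round(indexed_allocation(100.0, 1.1, 1.0, theta), 2))
```

Under this sketch, partial indexation dampens how much the cap tracks a GDP surprise, while super-indexation amplifies it.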

    Risk programming analysis with imperfect information

    A Monte Carlo procedure is used to demonstrate the dangers of basing (farm) risk programming on only a few states of nature and to study the impact of applying alternative risk programming methods. Two risk programming formulations are considered, namely mean-variance (E,V) programming and utility efficient (UE) programming. For the particular example of a Norwegian mixed livestock and crop farm, the programming solution is unstable with few states, although the cost of picking a sub-optimal plan declines with increases in the number of states. Comparing the E,V results with the UE results shows that there were few discrepancies between the two, and the differences which do occur are mainly trivial; thus both methods gave unreliable results in cases with small samples.
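The instability under few states of nature can be reproduced in a few lines. The activities, moments, and risk-aversion coefficient below are invented for illustration, and the E,V objective is solved by a coarse grid search over activity shares rather than a proper quadratic program.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented example: true gross-margin moments for three farm activities.
mu = np.array([120.0, 100.0, 90.0])
cov = np.array([[900.0, 200.0, 100.0],
                [200.0, 400.0,  50.0],
                [100.0,  50.0, 250.0]])
phi = 0.05  # assumed risk-aversion coefficient in the E,V objective

def ev_optimal_mix(states):
    """Maximise E - phi*V over a coarse grid of activity shares,
    using sample moments estimated from the given states of nature."""
    m, s = states.mean(axis=0), np.cov(states, rowvar=False)
    best, best_x = -np.inf, None
    for x1 in np.linspace(0, 1, 21):
        for x2 in np.linspace(0, 1 - x1, 21):
            x = np.array([x1, x2, 1 - x1 - x2])
            val = x @ m - phi * (x @ s @ x)
            if val > best:
                best, best_x = val, x
    return best_x

# Optimal plans jump around when only 5 states are sampled, but
# settle down as the number of states grows.
small = np.array([ev_optimal_mix(rng.multivariate_normal(mu, cov, 5))
                  for _ in range(20)])
large = np.array([ev_optimal_mix(rng.multivariate_normal(mu, cov, 2000))
                  for _ in range(20)])
print("share spread, 5 states:   ", np.ptp(small, axis=0).round(2))
print("share spread, 2000 states:", np.ptp(large, axis=0).round(2))
```

The spread of the chosen activity shares across repeated small samples is the instability the abstract describes.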

    Expectations of linear functions with respect to truncated multinormal distributions, with applications for uncertainty analysis in environmental modelling

    This paper discusses results concerning multivariate normal distributions that are subject to truncation by a hyperplane and how such results can be applied to uncertainty analysis in the environmental sciences. We present a suite of results concerning truncated multivariate normal distributions, some of which already appear in the mathematical literature. The focus here is to make these types of results more accessible to the environmental science community, and to this end we include a conceptually simple alternative derivation of an important result. We illustrate how the theory of truncated multivariate normal distributions can be employed in the environmental sciences by means of an example from the economics of climate change control.
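For the special case of one-sided truncation below a hyperplane, the conditional mean of a multinormal vector has a well-known closed form, which a rejection-sampling Monte Carlo can verify. The parameters below are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Illustrative parameters (not from the paper).
mu = np.array([1.0, 0.5])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])
c, b = np.array([1.0, -1.0]), 0.8   # truncation: keep x with c @ x <= b
a = np.array([2.0, 1.0])            # linear function of interest, a @ x

# Closed form: with Y = c'X ~ N(m, s^2) and alpha = (b - m) / s,
#   E[X | Y <= b] = mu - (Sigma c / s) * pdf(alpha) / cdf(alpha),
# a standard truncated-normal moment result for Gaussian vectors.
m, s = c @ mu, np.sqrt(c @ Sigma @ c)
alpha = (b - m) / s
closed = a @ (mu - (Sigma @ c / s) * norm.pdf(alpha) / norm.cdf(alpha))

# Monte Carlo check by rejection sampling.
x = rng.multivariate_normal(mu, Sigma, 400_000)
mc = (x[x @ c <= b] @ a).mean()
print(closed, mc)
```

Because expectation is linear, E[a'X | c'X <= b] follows directly from the conditional mean vector, which is the kind of quantity the paper's results deliver in closed form.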

    Asymptotic crossing rates for stationary Gaussian vector processes

    For stationary differentiable Gaussian vector processes the expected number of crossings through a hypersurface is given by a surface integral. In general, this is difficult to calculate. In this paper asymptotic approximations for these surface integrals are derived.
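In the scalar case the crossing-rate integral reduces to Rice's classical formula, which a spectral simulation can illustrate. The discrete spectrum below is an arbitrary choice, and the cosine-series synthesis is only approximately Gaussian, so the match is approximate by construction.

```python
import numpy as np

rng = np.random.default_rng(2)

# Rice's formula for a zero-mean stationary differentiable Gaussian
# process x(t): the expected rate of upcrossings of level u is
#   nu(u) = (1 / (2*pi)) * sqrt(lam2 / lam0) * exp(-u**2 / (2 * lam0)),
# with spectral moments lam0 = Var[x(t)], lam2 = Var[x'(t)].
omegas = np.linspace(0.2, 3.0, 40)     # assumed discrete spectrum
S = np.exp(-omegas**2 / 2)
amps = np.sqrt(2 * S / S.sum())        # normalised so that lam0 = 1
lam0 = 0.5 * np.sum(amps**2)
lam2 = 0.5 * np.sum((amps * omegas)**2)

u = 1.0
rice = np.sqrt(lam2 / lam0) / (2 * np.pi) * np.exp(-u**2 / (2 * lam0))

# Monte Carlo check: synthesise sample paths as cosine series with
# random phases (approximately Gaussian for many terms) and count
# discrete upcrossings of the level u.
t = np.linspace(0.0, 100.0, 10001)
tw = np.outer(t, omegas)
n_paths, count = 100, 0
for _ in range(n_paths):
    x = (amps * np.cos(tw + rng.uniform(0, 2 * np.pi, omegas.size))).sum(axis=1)
    count += np.sum((x[:-1] < u) & (x[1:] >= u))
rate = count / (n_paths * 100.0)
print(rice, rate)
```

The vector-process surface integrals the paper treats generalise exactly this quantity, which is why they are hard to evaluate and worth approximating asymptotically.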

    Genz and Mendell-Elston Estimation of the High-Dimensional Multivariate Normal Distribution

    Statistical analysis of multinomial data in complex datasets often requires estimation of the multivariate normal (MVN) distribution for models in which the dimensionality can easily reach 10–1000 and higher. Few algorithms for estimating the MVN distribution can offer robust and efficient performance over such a range of dimensions. We report a simulation-based comparison of two algorithms for the MVN that are widely used in statistical genetic applications. The venerable Mendell-Elston approximation is fast but execution time increases rapidly with the number of dimensions, estimates are generally biased, and an error bound is lacking. The correlation between variables significantly affects absolute error but not overall execution time. The Monte Carlo-based approach described by Genz returns unbiased and error-bounded estimates, but execution time is more sensitive to the correlation between variables. For ultra-high-dimensional problems, however, the Genz algorithm exhibits better scale characteristics and greater time-weighted efficiency of estimation.
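SciPy's MVN cdf is based on Genz's randomised quasi-Monte Carlo method, so a small equicorrelated example shows the kind of comparison made here. The dimension and correlation are illustrative; with equicorrelation rho = 0.5, the orthant probability has the known closed form 1/(d + 1), making each estimator's error directly observable.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(3)

# Equicorrelated MVN with rho = 0.5 in d = 10 dimensions; the orthant
# probability P(X_1 <= 0, ..., X_d <= 0) equals 1/(d + 1) exactly.
d, rho = 10, 0.5
cov = np.full((d, d), rho) + (1 - rho) * np.eye(d)

# Genz-style randomised quasi-Monte Carlo (the basis of SciPy's MVN cdf).
genz = multivariate_normal(mean=np.zeros(d), cov=cov).cdf(np.zeros(d))

# Plain Monte Carlo for comparison.
x = rng.multivariate_normal(np.zeros(d), cov, 200_000)
mc = np.mean(np.all(x <= 0.0, axis=1))
print(genz, mc, 1 / (d + 1))
```

The Mendell-Elston approximation is not implemented in SciPy, so this sketch only exercises the Genz-style estimator against a brute-force baseline.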

    Calculating Value-at-Risk

    The market risk of a portfolio refers to the possibility of financial loss due to the joint movement of systematic economic variables such as interest and exchange rates. Quantifying market risk is important to regulators in assessing solvency and to risk managers in allocating scarce capital. Moreover, market risk is often the central risk faced by financial institutions. The standard method for measuring market risk places a conservative, one-sided confidence interval on portfolio losses for short forecast horizons. This bound on losses is often called capital-at-risk or value-at-risk (VAR), for obvious reasons. Calculating the VAR or any similar risk metric requires a probability distribution of changes in portfolio value. In most risk management models, this distribution is derived by placing assumptions on (1) how the portfolio function is approximated, and (2) how the state variables are modeled. Using this framework, we first review four methods for measuring market risk. We then develop and illustrate two new market risk measurement models that use a second-order approximation to the portfolio function and a multivariate GARCH(1,1) model for the state variables. We show that when changes in the state variables are modeled as conditional or unconditional multivariate normal, first-order approximations to the portfolio function yield a univariate normal for the change in portfolio value while second-order approximations yield a quadratic normal. Using equity return data and a hypothetical portfolio of options, we then evaluate the performance of all six models by examining how accurately each calculates the VAR on an out-of-sample basis. We find that our most general model is superior to all others in predicting the VAR.
In additional empirical tests focusing on the error contribution of each of the two model components, we find that the superior performance of our most general model is largely attributable to the use of the second-order approximation, and that the first-order approximations favored by practitioners perform quite poorly. Empirical evidence on the modeling of the state variables is mixed but supports usage of a model which reflects non-linearities in state variable return distributions. This paper was presented at the Financial Institutions Center's October 1996 conference on "
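The first-order case the abstract describes can be sketched in a few lines: linearising the portfolio in the state variables makes the change in portfolio value univariate normal, so the VAR is a normal quantile of the P&L distribution. The sensitivities and covariance below are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

# Delta-normal sketch: linearise the portfolio in the state variables,
# so the daily P&L is univariate normal and the one-sided loss bound
# is a normal quantile. All numbers are invented.
delta = np.array([1.5e6, -0.8e6])   # dollar sensitivities to two state variables
cov = np.array([[1.0e-4, 3.0e-5],   # daily covariance of state-variable returns
                [3.0e-5, 4.0e-4]])

sigma_p = np.sqrt(delta @ cov @ delta)  # std dev of linearised daily P&L
var_99 = norm.ppf(0.99) * sigma_p       # one-day 99% VAR (loss bound)
print(round(var_99, 2))
```

The paper's second-order models replace the linear map `delta` with a quadratic form in the state variables, giving the quadratic-normal P&L distribution mentioned above; that extension is not sketched here.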

    Optimisation methods in structural systems reliability
