OpenTURNS: An industrial software for uncertainty quantification in simulation
The need to assess robust performance of complex systems and to satisfy
tighter regulatory requirements (security, safety, environmental control,
health impacts, etc.) has given rise to a new industrial simulation
challenge: taking uncertainties into account when dealing with complex
numerical simulation frameworks. A generic methodology has therefore
emerged from the joint effort of several industrial companies and academic
institutions. EDF R&D, Airbus Group and Phimeca Engineering started a
collaboration at the beginning of 2005, joined by IMACS in 2014, to develop
an Open Source software platform dedicated to uncertainty propagation by
probabilistic methods, named OpenTURNS for Open source Treatment of
Uncertainty, Risk 'N Statistics. OpenTURNS addresses the specific
industrial challenges attached to uncertainties: transparency, genericity,
modularity and multi-accessibility. This paper focuses on OpenTURNS and
presents its main features: OpenTURNS is open source software under the
LGPL license that presents itself as a C++ library and a Python TUI, and
runs under Linux and Windows. All the methodological tools are described in
the different sections of this paper: uncertainty quantification,
uncertainty propagation, sensitivity analysis and metamodeling. A section
also explains the generic wrapper mechanism for linking OpenTURNS to any
external code. The paper illustrates the methodological tools as much as
possible on an educational example that simulates the height of a river and
compares it to the height of a dyke protecting industrial facilities.
Finally, it gives an overview of the main developments planned for the next
few years.
Dynamic dependence networks: Financial time series forecasting and portfolio decisions (with discussion)
We discuss Bayesian forecasting of increasingly high-dimensional time series,
a key area of application of stochastic dynamic models in the financial
industry and allied areas of business. Novel state-space models characterizing
sparse patterns of dependence among multiple time series extend existing
multivariate volatility models to enable scaling to higher numbers of
individual time series. The theory of these "dynamic dependence network" models
shows how the individual series can be "decoupled" for sequential analysis, and
then "recoupled" for applied forecasting and decision analysis. Decoupling
allows fast, efficient analysis of each of the series in individual univariate
models that are linked, for later recoupling, through a theoretical
multivariate volatility structure defined by a sparse underlying graphical
model. Computational advances are especially significant in connection with
model uncertainty about the sparsity patterns among series that define this
graphical model; Bayesian model averaging using discounting of historical
information builds substantially on this computational advance. An extensive,
detailed case study showcases the use of these models, and the improvements in
forecasting and financial portfolio investment decisions that are achievable.
Using a long series of daily international currency, stock indices and
commodity prices, the case study includes evaluations of multi-day forecasts
and Bayesian portfolio analysis with a variety of practical utility functions,
as well as comparisons against commodity trading advisor benchmarks.
Comment: 31 pages, 9 figures, 3 tables
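The decouple/recouple idea above can be caricatured in a few lines. This is a crude sketch only: each series gets its own univariate filter (here a simple discount-weighted mean/variance recursion, standing in for the paper's univariate DLMs), and the one-step forecasts are then recoupled through an assumed sparse graph that dictates which cross-covariances may be nonzero. The data, discount factor, graph and correlation value are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Decoupled step: m independent univariate filters, one per series.
T, m = 500, 3
y = rng.standard_normal((T, m)).cumsum(axis=0) * 0.1 + rng.standard_normal((T, m))

delta = 0.97                      # assumed discount factor
level = y[0].copy()
var = np.ones(m)
for t in range(1, T):
    err = y[t] - level
    level = level + (1 - delta) * err        # local-level update
    var = delta * var + (1 - delta) * err**2 # discounted variance update

# Recoupled step: a sparse graph (series 1-2 linked, series 3 isolated)
# fixes which covariances are allowed to be nonzero; an assumed correlation
# fills in the permitted off-diagonal entries.
graph = np.array([[1, 1, 0],
                  [1, 1, 0],
                  [0, 0, 1]])
rho = 0.4
sd = np.sqrt(var)
cov = np.diag(var)
for i in range(m):
    for j in range(m):
        if i != j and graph[i, j]:
            cov[i, j] = rho * sd[i] * sd[j]

print("one-step forecast mean:", level)
print("recoupled forecast covariance:\n", cov)
```

The point of the construction is that the expensive part (filtering) is embarrassingly parallel across series, while the multivariate structure is imposed only at forecast time.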
Model Averaging in Risk Management with an Application to Futures Markets
This paper considers the problem of model uncertainty in the case of multi-asset volatility models and discusses the use of model averaging techniques as a way of dealing with the risk of inadvertently using false models in portfolio management. Evaluation of volatility models is then considered, and a simple Value-at-Risk (VaR) diagnostic test is proposed for individual as well as 'average' models. The asymptotic as well as the exact finite-sample distribution of the test statistic, dealing with the possibility of parameter uncertainty, are established. The model averaging idea and the VaR diagnostic tests are illustrated by an application to portfolios of daily returns on six currencies, four equity indices, four ten-year government bonds and four commodities over the period 1991-2007. The empirical evidence supports the use of 'thick' model averaging strategies over single models or Bayesian-type model averaging procedures.
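A VaR diagnostic of the kind mentioned above checks whether the realized violation rate matches the nominal level. The sketch below uses the standard Kupiec unconditional-coverage likelihood-ratio test as a stand-in; it is not necessarily the test proposed in the paper, and the simulated Gaussian returns are purely illustrative. Under the null the statistic is asymptotically chi-squared with one degree of freedom.

```python
import math, random

random.seed(1)

# Kupiec unconditional-coverage LR test: compare the likelihood of the
# observed violation count under the nominal rate alpha vs. its MLE.
def kupiec_lr(n_obs, n_viol, alpha):
    p_hat = n_viol / n_obs
    p_hat = min(max(p_hat, 1e-10), 1 - 1e-10)  # guard against log(0)
    ll0 = n_viol * math.log(alpha) + (n_obs - n_viol) * math.log(1 - alpha)
    ll1 = n_viol * math.log(p_hat) + (n_obs - n_viol) * math.log(1 - p_hat)
    return 2.0 * (ll1 - ll0)

# Simulated example: a correctly calibrated 5% VaR should usually not be
# rejected (the chi-squared(1) 95% critical value is about 3.84).
alpha = 0.05
returns = [random.gauss(0, 1) for _ in range(2000)]
var_level = -1.645               # 5% VaR of a standard normal
violations = sum(r < var_level for r in returns)
lr = kupiec_lr(len(returns), violations, alpha)
print(f"violations: {violations}, LR = {lr:.3f}")
```

The same statistic can be computed for each candidate model and for the 'thick' average, which is how such a diagnostic slots into the model-averaging comparison.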
Polyhedral Predictive Regions For Power System Applications
Despite substantial improvements in the development of forecasting
approaches, conditional and dynamic uncertainty estimates ought to be
accommodated in decision-making in power system operation and markets, in
order to yield either cost-optimal decisions in expectation or decisions
with probabilistic guarantees. The representation of uncertainty serves as
an interface between forecasting and decision-making problems, with
different approaches handling various objects and their parameterization as
input. Following substantial developments based on scenario-based
stochastic methods, robust and chance-constrained optimization approaches
have gained increasing attention. These often rely on polyhedra as a
representation of the convex envelope of uncertainty. In this work, we aim
to bridge the gap between the probabilistic forecasting literature and such
optimization approaches by generating forecasts in the form of polyhedra
with probabilistic guarantees. For that, we view polyhedra as parameterized
objects under alternative norm-based definitions, the parameters of which
may be modelled and predicted. We additionally discuss assessing the
predictive skill of such multivariate probabilistic forecasts. An
application and related empirical results allow us to verify the
probabilistic calibration and predictive skill of our polyhedra.
Comment: 8 pages
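The simplest polyhedral predictive region is a box (an infinity-norm polyhedron): per-dimension empirical quantiles on a training sample, with joint coverage then checked out of sample. The sketch below is only that special case; the Gaussian data, the correlation level and the 2%/98% quantile choice are illustrative assumptions, and the paper's polyhedra are parameterized and predicted dynamically rather than fixed.

```python
import numpy as np

rng = np.random.default_rng(7)

# Box-shaped predictive region: per-dimension quantile bounds fitted on a
# training sample, joint empirical coverage measured on a test sample.
d, n_train, n_test = 3, 5000, 5000
cov = 0.5 * np.eye(d) + 0.5          # unit variances, 0.5 cross-correlation
train = rng.multivariate_normal(np.zeros(d), cov, n_train)
test = rng.multivariate_normal(np.zeros(d), cov, n_test)

lo = np.quantile(train, 0.02, axis=0)   # per-dimension lower bounds
hi = np.quantile(train, 0.98, axis=0)   # per-dimension upper bounds

inside = np.all((test >= lo) & (test <= hi), axis=1)
coverage = inside.mean()
print(f"joint empirical coverage of the box region: {coverage:.3f}")
```

Note that per-dimension coverage does not translate directly into joint coverage, which is exactly why calibrating the polyhedron's parameters jointly, as the abstract proposes, matters.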
Volatility forecasting
Volatility has been one of the most active and successful areas of research in time series econometrics and economic forecasting in recent decades. This chapter provides a selective survey of the most important theoretical developments and empirical insights to emerge from this burgeoning literature, with a distinct focus on forecasting applications. Volatility is inherently latent, and Section 1 begins with a brief intuitive account of various key volatility concepts. Section 2 then discusses a series of different economic situations in which volatility plays a crucial role, ranging from the use of volatility forecasts in portfolio allocation to density forecasting in risk management. Sections 3, 4 and 5 present a variety of alternative procedures for univariate volatility modeling and forecasting based on the GARCH, stochastic volatility and realized volatility paradigms, respectively. Section 6 extends the discussion to the multivariate problem of forecasting conditional covariances and correlations, and Section 7 discusses volatility forecast evaluation methods in both univariate and multivariate cases. Section 8 concludes briefly.
JEL Classification: C10, C53, G1
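The GARCH paradigm the survey covers rests on a single recursion, sigma2[t+1] = omega + alpha * r[t]^2 + beta * sigma2[t], and multi-step forecasts mean-revert to the unconditional variance omega / (1 - alpha - beta). A minimal sketch, with parameter values that are illustrative assumptions rather than estimates (in practice they are fit by quasi-maximum likelihood):

```python
import random

random.seed(3)

# GARCH(1,1): simulate returns from the model, then forecast variance.
omega, alpha, beta = 0.05, 0.08, 0.90
sigma2 = omega / (1 - alpha - beta)       # start at unconditional variance

returns = []
for _ in range(1000):
    r = random.gauss(0.0, sigma2 ** 0.5)  # conditionally Gaussian return
    returns.append(r)
    sigma2 = omega + alpha * r * r + beta * sigma2   # variance recursion

# Multi-step forecast: iterate the recursion with E[r^2] = sigma2, so the
# forecast mean-reverts toward the long-run variance.
h = 10
forecast = sigma2
for _ in range(h - 1):
    forecast = omega + (alpha + beta) * forecast
long_run = omega / (1 - alpha - beta)

print(f"1-step: {sigma2:.3f}, {h}-step: {forecast:.3f}, long-run: {long_run:.3f}")
```

The persistence alpha + beta = 0.98 here is deliberately high, mimicking the slow decay of volatility shocks typical of daily financial returns.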
A review of applied methods in Europe for flood-frequency analysis in a changing environment
The report presents a review of methods used in Europe for trend analysis, climate change projections and non-stationary analysis of extreme precipitation and flood frequency. In addition, the main findings of the analyses are presented, including a comparison of trend analysis results and climate change projections. Existing guidelines in Europe on design flood and design rainfall estimation that incorporate climate change are reviewed. The report
concludes with a discussion of research needs on non-stationary frequency analysis for considering the effects of climate change and inclusion in design guidelines.
Trend analyses are reported for 21 countries in Europe with results for extreme precipitation, extreme streamflow or both. A large number of national and regional trend studies have been carried out. Most studies are based on statistical methods applied to individual time series of extreme precipitation or extreme streamflow using the non-parametric Mann-Kendall trend test or regression analysis. Some studies have been reported that use field significance or regional consistency tests to analyse trends over larger areas. Some of the studies also include analysis of trend attribution. The studies reviewed indicate that there is
some evidence of a general increase in extreme precipitation, whereas there are no clear indications of significant increasing trends at regional or national level of extreme streamflow. For some smaller regions increases in extreme streamflow are reported. Several studies from regions dominated by snowmelt-induced peak flows report decreases in extreme streamflow and earlier spring snowmelt peak flows. Climate change projections have been reported for 14 countries in Europe with results for extreme precipitation, extreme streamflow or both. The review shows various approaches for producing climate projections of extreme precipitation and flood frequency based on
alternative climate forcing scenarios, climate projections from available global and regional climate models, methods for statistical downscaling and bias correction, and alternative hydrological models. A large number of the reported studies are based on an ensemble modelling approach that uses several climate forcing scenarios and climate model projections in order to address the uncertainty in the projections of extreme precipitation and flood frequency. Some studies also include alternative statistical downscaling and bias correction methods and hydrological modelling approaches. Most studies reviewed indicate an increase in extreme precipitation under a future climate, which is consistent with the observed trend of extreme precipitation. Hydrological projections of peak flows and flood frequency show both positive and negative changes. Large increases in peak flows are reported for some catchments with rainfall-dominated peak flows, whereas a general decrease in flood magnitude and earlier spring floods are reported for catchments with snowmelt-dominated peak flows. The latter is consistent with the observed trends. The review of existing guidelines in Europe on design floods and design rainfalls shows that only a few countries explicitly address climate change. These design guidelines are based on climate change adjustment factors to be applied to current design estimates and may
depend on design return period and projection horizon. The review indicates a gap between the need for considering climate change impacts in design and actual published guidelines that incorporate climate change in extreme precipitation and flood frequency. Most of the studies reported are based on frequency analysis assuming stationary conditions in a certain time window (typically 30 years) representing current and future climate. There is a need for developing more consistent non-stationary frequency analysis methods that can account for the transient nature of a changing climate.
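The non-parametric Mann-Kendall test mentioned in the review reduces to counting concordant minus discordant pairs (the S statistic), normalizing by its variance, and comparing the z score to normal critical values. A minimal sketch, using the no-ties variance formula and a synthetic annual-maxima series with an imposed upward trend (the data are invented for illustration):

```python
import math, random

random.seed(5)

# Mann-Kendall trend test: S counts pairs (i < j) with x[j] > x[i] minus
# pairs with x[j] < x[i]; under H0 (no trend) the normalized S is ~N(0,1).
def mann_kendall_z(x):
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0  # assumes no tied values
    if s > 0:
        return (s - 1) / math.sqrt(var_s)     # continuity correction
    if s < 0:
        return (s + 1) / math.sqrt(var_s)
    return 0.0

# Synthetic annual maxima with a clear upward trend.
series = [random.gauss(100 + 0.8 * t, 10) for t in range(50)]
z = mann_kendall_z(series)
print(f"Mann-Kendall z = {z:.2f}  (|z| > 1.96 -> trend at the 5% level)")
```

As the review notes, applying such a test station by station ignores spatial correlation, which is why field-significance and regional-consistency checks are used for conclusions over larger areas.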
A Hierarchical Spatio-Temporal Statistical Model Motivated by Glaciology
In this paper, we extend and analyze a Bayesian hierarchical spatio-temporal
model for physical systems. A novelty is to model the discrepancy between the
output of a computer simulator for a physical process and the actual process
values with a multivariate random walk. For computational efficiency, linear
algebra for bandwidth limited matrices is utilized, and first-order emulator
inference allows for the fast emulation of a numerical partial differential
equation (PDE) solver. A test scenario from a physical system motivated by
glaciology is used to examine the speed and accuracy of the computational
methods used, in addition to the viability of modeling assumptions. We conclude
by discussing how the model and associated methodology can be applied in other
physical contexts besides glaciology.
Comment: Revision accepted for publication by the Journal of Agricultural,
Biological, and Environmental Statistics
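The core discrepancy idea above, observations modeled as simulator output plus a random-walk discrepancy plus noise, can be sketched generatively in a few lines. Everything below (the toy linear "simulator", the noise scales) is an illustrative assumption standing in for the paper's PDE solver and hierarchical Bayesian treatment.

```python
import random

random.seed(11)

# Observation model: y_t = simulator(t) + discrepancy_t + noise,
# where discrepancy_t follows a random walk (here univariate; the paper
# uses a multivariate random walk).
def simulator(t):
    return 2.0 + 0.1 * t          # stand-in for a PDE-based computer model

tau = 0.05                        # random-walk innovation scale (assumed)
sigma = 0.2                       # observation noise scale (assumed)

discrepancy = 0.0
observations = []
for t in range(100):
    discrepancy += random.gauss(0.0, tau)        # random-walk step
    y = simulator(t) + discrepancy + random.gauss(0.0, sigma)
    observations.append(y)

residuals = [y - simulator(t) for t, y in enumerate(observations)]
print(f"final discrepancy: {discrepancy:.3f}, last residual: {residuals[-1]:.3f}")
```

Inference then runs this model in reverse: given observations, the random-walk structure lets the posterior absorb persistent simulator bias instead of forcing it into the noise term.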