
    Differential 2D Copula Approximating Transforms via Sobolev Training: 2-Cats Networks

    Copulas are a powerful statistical tool that captures dependencies across data dimensions. When applying copulas, we can estimate multivariate distribution functions by first estimating independent marginals, an easy task, and then a single copulating function, $C$, to connect the marginals, a hard task. For two-dimensional data, a copula is a two-increasing function of the form $C: (u,v)\in \mathbf{I}^2 \rightarrow \mathbf{I}$, where $\mathbf{I} = [0, 1]$. In this paper, we show how Neural Networks (NNs) can approximate any two-dimensional copula non-parametrically. Our approach, denoted 2-Cats, is inspired by the Physics-Informed Neural Networks and Sobolev Training literature. Not only do we show that we can estimate the output of a 2d copula better than the state of the art, but our approach is also non-parametric and respects the mathematical properties of a copula $C$.
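    As a point of reference for the decomposition the abstract describes (marginals first, then the copulating function $C$), the sketch below builds the classical non-parametric baseline, the empirical copula from rank-transformed pseudo-observations, on hypothetical toy data; it is not the 2-Cats network itself.

```python
import numpy as np
from scipy import stats

# Toy illustration of Sklar's decomposition F(x, y) = C(F_X(x), F_Y(y)):
# estimate the marginals via ranks, then evaluate a non-parametric copula.
rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)
y = 0.7 * x + rng.normal(size=n)           # dependent toy data

# Step 1: pseudo-observations in (0, 1) from the (easy) marginal estimation
u = stats.rankdata(x) / (n + 1)
v = stats.rankdata(y) / (n + 1)

# Step 2: empirical copula C_n(a, b) = (1/n) * #{i : u_i <= a and v_i <= b}
def empirical_copula(a, b):
    return np.mean((u <= a) & (v <= b))

print(empirical_copula(0.5, 0.5))          # > 0.25 signals positive dependence
print(empirical_copula(1.0, 1.0))          # boundary property: C(1, 1) = 1
```

    A learned copula such as 2-Cats must respect the same boundary and two-increasing properties that this empirical estimator satisfies by construction.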

    Bayesian Network Approach to Assessing System Reliability for Improving System Design and Optimizing System Maintenance

    A quantitative analysis of a system that has a complex reliability structure always involves considerable challenges. This dissertation mainly addresses uncertainty inherent in complicated reliability structures that may cause unexpected and undesired results. The reliability structure uncertainty cannot be handled by traditional reliability analysis tools such as the Fault Tree and the Reliability Block Diagram due to their deterministic Boolean logic. Therefore, I employ a Bayesian network, which provides a flexible modeling method for building a multivariate distribution. By representing a system reliability structure as a joint distribution, the uncertainty and correlations existing between the system's elements can effectively be modeled in a probabilistic manner. This dissertation focuses on analyzing system reliability over the entire system life cycle, particularly the production stage and the early design stages. In the production stage, the research investigates a system that is continuously monitored by on-board sensors. By modeling the complex reliability structure with a Bayesian network integrated with various stochastic processes, I propose several methodologies that evaluate system reliability on a real-time basis and optimize maintenance schedules. In the early design stages, the research aims to predict system reliability based on the current system design and to improve the design if necessary. The three main challenges in this research are: 1) the lack of field failure data, 2) the complex reliability structure, and 3) how to effectively improve the design. To tackle these difficulties, I present several modeling approaches using Bayesian inference and a nonparametric Bayesian network, where the system is explicitly analyzed through sensitivity analysis. In addition, this modeling approach is enhanced by incorporating a temporal dimension. However, the nonparametric Bayesian network approach is generally accompanied by high computational effort, especially when a complex and large system is modeled. To alleviate this computational burden, I also suggest building a surrogate model with quantile regression. In summary, this dissertation studies and explores the use of Bayesian networks in analyzing complex systems. All proposed methodologies are demonstrated by case studies.
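    To make concrete what "representing a system reliability structure as a joint distribution" buys over deterministic Boolean logic, here is a minimal hypothetical sketch (not one of the dissertation's models): a two-component parallel system whose components share a common-cause environment variable, encoded as a tiny Bayesian network and evaluated by summing the joint distribution.

```python
import itertools

# Tiny Bayesian network E -> A, E -> B for a two-component parallel system.
# E = 1 means a harsh environment that degrades both components at once,
# a correlation a plain Reliability Block Diagram cannot express.
p_e = {0: 0.95, 1: 0.05}                      # P(E)
p_a_given_e = {0: 0.98, 1: 0.80}              # P(A works | E)
p_b_given_e = {0: 0.97, 1: 0.75}              # P(B works | E)

reliability = 0.0
for e, a, b in itertools.product([0, 1], repeat=3):
    p = (p_e[e]
         * (p_a_given_e[e] if a else 1 - p_a_given_e[e])
         * (p_b_given_e[e] if b else 1 - p_b_given_e[e]))
    if a or b:                                # parallel structure: one survivor suffices
        reliability += p

print(f"P(system works) = {reliability:.4f}")
```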

    Optimal Currency Composition for China’s Foreign Reserves: a Copula Approach

    This paper investigates the optimal currency composition for a country's foreign reserves. In the context of China, we examine the asymmetric, fat-tailed and complex dependence structure in the distributions of currency returns. A pair-copula construction with skewed, fat-tailed marginals is then built to capture features of the higher moments. Using a D-vine copula approach, we show that, under the disappointment aversion effect, the central bank in our model can achieve sizeable gains in expected economic value by switching from mean-variance to copula modelling. We find that this approach leads to an optimal currency composition that allows China more space for international currency diversification while maintaining the leading position of the US dollar in the currency shares of China's reserves.
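    The asymmetric, fat-tailed dependence the paper targets is exactly what a single pair-copula can express and a correlation matrix cannot. As a hedged illustration (one Clayton pair-copula with Student-t marginals on synthetic data, not the paper's full D-vine model), the sketch below samples two "currency return" series that crash together far more often than independence would suggest.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n, theta = 50_000, 2.0                        # theta > 0 sets Clayton dependence strength

# Conditional-inversion sampler for the Clayton copula (lower-tail dependent)
u = rng.uniform(size=n)
w = rng.uniform(size=n)
v = ((w ** (-theta / (1 + theta)) - 1) * u ** (-theta) + 1) ** (-1 / theta)

# Fat-tailed marginals: Student-t with 4 degrees of freedom, ~1% daily scale
r1 = stats.t.ppf(u, df=4) * 0.01
r2 = stats.t.ppf(v, df=4) * 0.01

# Joint lower-tail behaviour: both returns in their worst 5% at the same time
joint_crash = np.mean((u < 0.05) & (v < 0.05))
print(f"P(both in worst 5%) = {joint_crash:.3f}  (0.0025 under independence)")
```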

    Mean-CVaR portfolio: a mixture-copula approach

    The current study conducts a comparative analysis of portfolio optimization techniques in the context of financial applications. The proposed approach uses mixture copulas as an alternative to mitigate the risks inherent in stock-market investments, particularly during times of economic crisis. Price data for 19 country index ETFs were sourced from Historical Market Data - Stooq, spanning the period from 2013 to 2023. The study employed Mean-CVaR portfolio optimization, and the structural dependence between assets was modeled using a mixture of copulas (specifically Clayton-t-Gumbel), with marginals fitted by an AR(1)-GARCH(1,1) model. The results of simulations based on this strategy were compared with two benchmark portfolios, one using Gaussian copulas and one equally weighted, across three distinct time horizons: one, two, and five years. Portfolios generated from returns simulated with the mixture-copula technique demonstrated superior risk-return performance compared with the benchmark portfolios. At the same time, a reduction in financial losses was observed, with equivalent or superior returns, particularly over the longer time horizons, where the estimates were more accurate.
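    A minimal sketch of the Mean-CVaR step is shown below, assuming a matrix of simulated scenario returns is already available (in the study these would come from the fitted Clayton-t-Gumbel mixture with AR(1)-GARCH(1,1) marginals; here toy Gaussian draws stand in so the example runs on its own).

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n_assets = 5
scenarios = rng.normal(0.0005, 0.01, size=(10_000, n_assets))   # toy scenario returns

alpha = 0.95                                   # CVaR confidence level

def cvar(weights):
    losses = -scenarios @ weights              # portfolio loss in each scenario
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()        # average loss beyond the VaR

# Long-only, fully invested portfolio that minimizes CVaR; a target-return
# constraint could be added to trace out the mean-CVaR frontier.
res = minimize(
    cvar,
    x0=np.full(n_assets, 1.0 / n_assets),
    method="SLSQP",
    bounds=[(0.0, 1.0)] * n_assets,
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
)
print("weights:", np.round(res.x, 3), "CVaR:", round(cvar(res.x), 5))
```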

    Generating weather for climate impact assessment on lakes


    Statistical and stochastic post-processing of regional climate model data: copula-based downscaling, disaggregation and multivariate bias correction

    In order to delineate management or climate change adaptation strategies for natural or technical water bodies, impact studies are necessary. To this end, impact models are set up for a given region which requires time series of meteorological data as driving data. Regional climate models (RCMs) are capable of simulating gridded data sets of several meteorological variables. The advantages over observed data are that the time series are complete and that meteorological information is also provided for ungauged locations. Furthermore, climate change impact studies can be conducted by driving the simulations with different forcing variables for future periods. While the performance of RCMs generally increases with a higher spatio-temporal resolution, the computational and storage demand increases non-linearly which can impede such highly resolved simulations in practice. Furthermore, systematic biases of the univariate distributions and multivariate dependence structures are a common problem of RCM simulations on all spatio-temporal scales. Depending on the case study, meteorological data must fulfill different criteria. For instance, the spatio-temporal resolution of precipitation time series should be as fine as 1 km and 5 minutes in order to be used for urban hydrological impact models. To bridge the gap between the demands of impact modelers and available meteorological RCM data, different computationally efficient statistical and stochastic post-processing techniques have been developed to correct the bias and to increase the spatio-temporal resolution. The main meteorological variable treated in this thesis is precipitation due to its importance for hydrological impact studies. The models presented in this thesis belong to the classes of bias correction, downscaling and temporal disaggregation techniques. The focus of the developed methods lies on multivariate copulas. Copulas constitute a promising modeling approach for highly-skewed and mixed discrete-continuous variables like precipitation since the marginal distribution is treated separately from the dependence structure. This feature makes them useful for the modeling of different meteorological variables as well. While copulas have been utilized in the past to model precipitation and other meteorological variables that are relevant in hydrology, applications to RCM simulations are not very common. The first method is a geostatistical estimation technique for distribution parameters of daily precipitation for ungauged locations, so that a bias correction with Quantile Mapping can be performed. The second method is a spatial downscaling of coarse scale RCM precipitation fields to a finer resolved domain. The model is based on the Gaussian Copula and generates ensembles of daily precipitation fields that resemble the precipitation fields of fine scale RCM simulations. The third method disaggregates hourly precipitation time series simulated by an RCM to a resolution of 5 minutes. The Gaussian Copula was utilized to condition the simulation on both spatial and temporal precipitation amounts to respect the spatio-temporal dependence structure. The fourth method is an approach to simulate a meteorological variable conditional on other variables at the same location and time step. The method was developed to improve the inter-variable dependence structure of univariately bias corrected RCM simulations in an hourly resolution
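    The first of the four methods builds on Quantile Mapping, whose core idea can be shown in a few lines. The sketch below uses made-up gamma-distributed arrays standing in for observed and RCM-simulated daily precipitation at a single gauge: each simulated value is mapped from the simulated distribution onto the observed quantiles, correcting the univariate bias.

```python
import numpy as np

rng = np.random.default_rng(7)
obs = rng.gamma(shape=0.8, scale=6.0, size=3000)     # "observed" daily precipitation (mm)
sim = rng.gamma(shape=0.7, scale=4.0, size=3000)     # biased "RCM" simulation (mm)

# Empirical quantiles of both distributions define the transfer function
probs = np.linspace(0.01, 0.99, 99)
q_obs = np.quantile(obs, probs)
q_sim = np.quantile(sim, probs)

def quantile_map(x):
    # Interpolate each simulated value from simulated quantiles to observed quantiles
    return np.interp(x, q_sim, q_obs)

corrected = quantile_map(sim)
print("mean bias before:", round(sim.mean() - obs.mean(), 2),
      "after:", round(corrected.mean() - obs.mean(), 2))
```

    The geostatistical contribution of the first method is then to estimate such distribution parameters at ungauged locations, while the remaining methods use copulas, in particular the Gaussian copula, to restore spatial, temporal and inter-variable dependence structures.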

    Modelling bid-ask spread conditional distributions using hierarchical correlation reconstruction

    While we would like to predict exact values, the available information, being incomplete, is rarely sufficient, usually allowing only conditional probability distributions to be predicted. This article discusses the hierarchical correlation reconstruction (HCR) methodology for such prediction, using the example of bid-ask spreads (usually unavailable), here predicted from more accessible data such as closing price, volume, high/low price and returns. Using the HCR methodology, as in copula theory, we first normalized the marginal distributions so that they were nearly uniform. We then modelled joint densities as linear combinations of orthonormal polynomials, obtaining their decomposition into mixed moments, and modelled each moment of the predicted variable separately as a linear combination of mixed moments of the known variables using least-squares linear regression. By combining these predicted moments, we obtained the predicted density as a polynomial, from which we can, for example, calculate the expected value, as well as the variance to quantify the uncertainty of the prediction, or use the entire distribution for more accurate further calculations or for generating random values. Ten-fold cross-validation log-likelihood tests were conducted for 22 DAX companies, leading to very accurate predictions, especially when individual models were used for each company, as significant differences were found between their behaviours. An additional advantage of this methodology is that it is computationally inexpensive: estimating and evaluating a model with hundreds of parameters and thousands of data points takes only a second on a computer.
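    A condensed sketch of the HCR recipe on toy data (not the DAX data set) follows: rank-normalise both variables to be nearly uniform, expand the joint density in orthonormal shifted Legendre polynomials, and estimate each mixed moment as a simple average of polynomial products, which is what makes the method so cheap.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
x = rng.normal(size=n)
y = 0.6 * x + rng.normal(size=n)              # toy "known" and "predicted" variables

# Rank normalisation to approximately Uniform(0, 1) marginals
u = (np.argsort(np.argsort(x)) + 0.5) / n
v = (np.argsort(np.argsort(y)) + 0.5) / n

# Orthonormal shifted Legendre polynomials on [0, 1], degrees 0..3
def p(j, t):
    basis = [np.ones_like(t),
             np.sqrt(3) * (2 * t - 1),
             np.sqrt(5) * (6 * t**2 - 6 * t + 1),
             np.sqrt(7) * (20 * t**3 - 30 * t**2 + 12 * t - 1)]
    return basis[j]

# Mixed moments a_jk = E[p_j(u) p_k(v)]; the joint density is approximated by
# rho(u, v) ~ sum_jk a_jk p_j(u) p_k(v), estimated here as plain averages.
a = np.array([[np.mean(p(j, u) * p(k, v)) for k in range(4)] for j in range(4)])

def density(uu, vv):
    return sum(a[j, k] * p(j, uu) * p(k, vv) for j in range(4) for k in range(4))

print("density at (0.9, 0.9):", round(float(density(0.9, 0.9)), 3))   # elevated
print("density at (0.9, 0.1):", round(float(density(0.9, 0.1)), 3))   # suppressed
```

    In the article, the moments of the predicted variable are additionally regressed on mixed moments of several known variables; the sketch stops at the density reconstruction step.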

    Sustainability Analysis and Environmental Decision-Making Using Simulation, Optimization, and Computational Analytics

    Effective environmental decision-making is often challenging and complex, and final solutions frequently possess inherently subjective political and socio-economic components. Consequently, complex sustainability applications in the “real world” frequently employ computational decision-making approaches to construct solutions to problems containing numerous quantitative dimensions and considerable sources of uncertainty. This volume includes a number of such applied computational analytics papers that either create new decision-making methods or provide innovative implementations of existing methods for addressing a wide spectrum of sustainability applications, broadly defined. The disparate contributions all emphasize novel approaches to computational analytics as applied to environmental decision-making and sustainability analysis, whether on the side of optimization, simulation, modelling, computational solution procedures, visual analytics, and/or information technologies.