9,350 research outputs found

    Data analytics 2016: proceedings of the fifth international conference on data analytics

    A Deep Learning Approach for Dynamic Balance Sheet Stress Testing

    In the aftermath of the financial crisis, supervisory authorities have considerably improved their approaches to financial stress testing. However, they have received significant criticism from market participants because of the methodological assumptions and simplifications employed, which are seen as not accurately reflecting real conditions. First and foremost, current stress testing methodologies attempt to simulate the risks underlying a financial institution's balance sheet with several satellite models, which makes their integration a challenging task and introduces significant estimation errors. Secondly, they still do not employ advanced statistical techniques, such as machine learning, that better capture the nonlinear nature of adverse shocks. Finally, the static balance sheet assumption that is often employed implies that a bank's management passively observes the realization of the adverse scenario and does nothing to mitigate its impact. To address this criticism, we introduce a novel deep learning approach for dynamic balance sheet stress testing. Experimental results give strong evidence that deep learning applied to large financial/supervisory datasets creates a state-of-the-art paradigm capable of simulating real-world scenarios more efficiently.
    Comment: Preprint submitted to Journal of Forecasting
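
    The abstract does not spell out the architecture, so the following is only a minimal sketch of what "dynamic balance sheet" projection with deep learning could look like: an LSTM that maps the adverse macro scenario path plus the current balance-sheet items to next-period items, letting the balance sheet evolve instead of staying static. All class names, shapes and hyperparameters are illustrative assumptions, not the authors' model.

    # Minimal sketch (illustrative, not the paper's model): project next-period
    # balance-sheet items from the macro scenario path and the current state.
    import torch
    import torch.nn as nn

    class DynamicBalanceSheetModel(nn.Module):
        def __init__(self, n_macro: int, n_items: int, hidden: int = 64):
            super().__init__()
            # input per quarter: macro scenario variables + current balance-sheet items
            self.lstm = nn.LSTM(n_macro + n_items, hidden, batch_first=True)
            self.head = nn.Linear(hidden, n_items)  # next-period balance-sheet items

        def forward(self, macro_path, items_path):
            # macro_path: (batch, T, n_macro)  adverse-scenario variables
            # items_path: (batch, T, n_items)  balance-sheet items observed so far
            x = torch.cat([macro_path, items_path], dim=-1)
            out, _ = self.lstm(x)
            return self.head(out)  # (batch, T, n_items) one-step-ahead projections

    # Training would minimise e.g. nn.MSELoss() between projected and realised items,
    # then roll the model forward under the supervisory adverse scenario.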

    Generative Adversarial Networks for Financial Trading Strategies Fine-Tuning and Combination

    Systematic trading strategies are algorithmic procedures that allocate assets with the aim of optimizing a certain performance criterion. To obtain an edge in a highly competitive environment, the analyst needs to properly fine-tune their strategy, or discover how to combine weak signals in novel alpha-creating ways. Both aspects, namely fine-tuning and combination, have been extensively researched using several methods, but emerging techniques such as Generative Adversarial Networks can have an impact on both. Therefore, our work proposes the use of Conditional Generative Adversarial Networks (cGANs) for trading strategy calibration and aggregation. To this end, we provide a full methodology on: (i) the training and selection of a cGAN for time series data; (ii) how each sample is used for strategy calibration; and (iii) how all generated samples can be used for ensemble modelling. To provide evidence that our approach is well grounded, we designed an experiment with multiple trading strategies, encompassing 579 assets. We compared cGANs with an ensemble scheme and model validation methods, both suited to time series. Our results suggest that cGANs are a suitable alternative for strategy calibration and combination, providing outperformance when traditional techniques fail to generate any alpha.
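
    As a rough illustration of the cGAN idea described above (the paper's exact architecture and training protocol are not given here), the sketch below conditions a generator on a recent market window and produces a synthetic future return path; such samples can then be used to calibrate a strategy's parameters or to weight several strategies in an ensemble. The MLP layers, window lengths and dimensions are assumptions for the sketch only.

    # Minimal cGAN sketch (illustrative): generator and discriminator both
    # conditioned on a recent return window.
    import torch
    import torch.nn as nn

    class Generator(nn.Module):
        def __init__(self, cond_len=60, noise_dim=32, out_len=20):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(cond_len + noise_dim, 128), nn.ReLU(),
                nn.Linear(128, 128), nn.ReLU(),
                nn.Linear(128, out_len),          # synthetic future returns
            )

        def forward(self, condition, noise):
            return self.net(torch.cat([condition, noise], dim=-1))

    class Discriminator(nn.Module):
        def __init__(self, cond_len=60, path_len=20):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(cond_len + path_len, 128), nn.LeakyReLU(0.2),
                nn.Linear(128, 1),                 # real-vs-generated score
            )

        def forward(self, condition, path):
            return self.net(torch.cat([condition, path], dim=-1))

    # Training alternates the usual GAN objectives; generated paths are then fed
    # to each candidate strategy, and parameters or ensemble weights are chosen
    # by the performance criterion averaged over the generated scenarios.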

    Long-Term Load Forecasting Considering Volatility Using Multiplicative Error Model

    Long-term load forecasting plays a vital role for utilities and planners in grid development and expansion planning. Overestimating long-term electricity load results in substantial wasted investment in the construction of excess power facilities, while underestimating future load results in insufficient generation and unmet demand. This paper presents a first-of-its-kind approach that uses a multiplicative error model (MEM) to forecast load over a long-term horizon. The MEM originates from the structure of the autoregressive conditional heteroscedasticity (ARCH) model, in which the conditional variance is dynamically parameterized and interacts multiplicatively with the innovation term of the time series. Historical load data, obtained from a U.S. regional transmission operator, and recession data for the years 1993-2016 are used in this study. The benefit of accounting for volatility is demonstrated by out-of-sample forecast results as well as by directional accuracy during the great economic recession of 2008. To incorporate future volatility, backtesting of the MEM is performed. The two performance indicators used to assess the proposed model are the mean absolute percentage error (for both in-sample model fit and out-of-sample forecasts) and directional accuracy.
    Comment: 19 pages, 11 figures, 3 tables
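
    To make the multiplicative error structure concrete, here is a small sketch of a standard MEM(1,1) recursion and its iterated forecast, written under the usual textbook form x_t = mu_t * eps_t with eps_t >= 0 and E[eps_t] = 1. The function names, the (1,1) order and the initialisation are assumptions for illustration; the paper's full specification may differ.

    # Sketch of a MEM(1,1):
    #     x_t  = mu_t * eps_t,   eps_t >= 0,  E[eps_t] = 1
    #     mu_t = omega + alpha * x_{t-1} + beta * mu_{t-1}
    import numpy as np

    def mem_filter(x, omega, alpha, beta):
        """Run the MEM(1,1) recursion over a non-negative load series x."""
        mu = np.empty_like(x, dtype=float)
        mu[0] = x.mean()  # simple initialisation
        for t in range(1, len(x)):
            mu[t] = omega + alpha * x[t - 1] + beta * mu[t - 1]
        return mu

    def mem_forecast(x_last, mu_last, omega, alpha, beta, horizon):
        """Iterate the recursion forward; E[x_{t+h}] = mu_{t+h} since E[eps] = 1."""
        forecasts, x_hat, mu = [], x_last, mu_last
        for _ in range(horizon):
            mu = omega + alpha * x_hat + beta * mu
            x_hat = mu
            forecasts.append(mu)
        return np.array(forecasts)

    # (omega, alpha, beta) would be estimated by maximum likelihood, e.g. under a
    # Gamma or exponential assumption on eps_t.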

    Best practices to improve maritime safety in the Gulf of Finland : a risk governance approach

    The Gulf of Finland in the Baltic Sea is a vulnerable sea area with high volumes of maritime traffic and difficult navigation conditions. The reactive international rules are no longer regarded as adequate for ensuring safety in this sea area. In this paper, a regional proactive risk governance approach is proposed for improving the effectiveness of safety policy formulation and management in the Gulf of Finland, based on the risk governance framework developed by the International Risk Governance Council (IRGC), the Formal Safety Assessment approach adopted by the International Maritime Organization (IMO), and best practices drawn from other sectors and sea areas. The approach is based on a formal process of identifying, assessing and evaluating accident risks at the regional level, and of adjusting policies or management practices before accidents occur. The proposed approach views maritime safety as a holistic system and manages it by combining scientific risk assessment with stakeholder input to identify risks and risk control options, and to evaluate risks. A regional proactive approach can improve safety by focusing on actual risks, by designing tailor-made safety measures to control them, by fostering a positive safety culture in the shipping industry, and by increasing trust among all parties involved.
    Non peer reviewed

    Uncertainty-Aware Workload Prediction in Cloud Computing

    Predicting future resource demand in Cloud Computing is essential for managing Cloud data centres and guaranteeing customers a minimum Quality of Service (QoS) level. Modelling the uncertainty of future demand improves the quality of the prediction and reduces the waste due to over-allocation. In this paper, we propose univariate and bivariate Bayesian deep learning models to predict the distribution of future resource demand and its uncertainty. We design different training scenarios for these models, where each procedure is a different combination of pretraining and fine-tuning steps over multiple dataset configurations. We also compare the bivariate model to its univariate counterpart, trained with one or more datasets, to investigate how the different components affect the accuracy of the prediction and impact the QoS. Finally, we investigate whether our models have transfer learning capabilities. Extensive experiments show that pretraining with multiple datasets boosts performance while fine-tuning does not. Our models generalise well on related but unseen time series, demonstrating transfer learning capabilities. A runtime performance analysis shows that the models are deployable in real-world applications. For this study, we preprocessed twelve datasets from real-world traces in a consistent and detailed way and made them available to facilitate research in this field.
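
    The abstract does not detail the Bayesian models, so the following is only a minimal sketch of one common way to predict a demand distribution rather than a point value: a network that outputs a mean and a log-variance for the next-step demand, trained with a Gaussian negative log-likelihood, with Monte-Carlo dropout at inference time adding model uncertainty on top of the learned variance. The layer sizes, window length and dropout rate are illustrative assumptions, not the authors' configuration.

    # Illustrative sketch: heteroscedastic predictor of next-step resource demand.
    import torch
    import torch.nn as nn

    class UncertainWorkloadPredictor(nn.Module):
        def __init__(self, window=48, hidden=64):
            super().__init__()
            self.body = nn.Sequential(
                nn.Linear(window, hidden), nn.ReLU(), nn.Dropout(0.1),
                nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(0.1),
            )
            self.mean = nn.Linear(hidden, 1)
            self.log_var = nn.Linear(hidden, 1)

        def forward(self, x):  # x: (batch, window) of past demand
            h = self.body(x)
            return self.mean(h), self.log_var(h)

    def gaussian_nll(y, mean, log_var):
        # Negative log-likelihood of y under N(mean, exp(log_var))
        return 0.5 * (log_var + (y - mean) ** 2 / log_var.exp()).mean()

    # Keeping dropout active at prediction time and averaging several stochastic
    # forward passes yields a predictive distribution; over-allocation can then be
    # sized from an upper quantile instead of a point estimate.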