
    Forecasting Long-Term Government Bond Yields: An Application of Statistical and AI Models

    This paper evaluates several artificial intelligence and classical algorithms on their ability to forecast the monthly yield of the US 10-year Treasury bonds from a set of four economic indicators. Due to the complexity of the prediction problem, the task represents a challenging test for the algorithms under evaluation. At the same time, the study is of particular significance given the important and paradigmatic role played by the US market in the world economy. Four data-driven artificial intelligence approaches are considered, namely a manually built fuzzy logic model, a machine-learned fuzzy logic model, a self-organising map model and a multi-layer perceptron model. Their performance is compared with that of two classical approaches, namely a statistical ARIMA model and an econometric error correction model. The algorithms are evaluated on a complete series of end-of-month US 10-year Treasury bond yields and economic indicators from 1986:1 to 2004:12. In terms of prediction accuracy and reliability of the modelling procedure, the best results are obtained by the three parametric regression algorithms, namely the econometric, the statistical and the multi-layer perceptron models. Due to the sparseness of the learning data samples, the manual and the automatic fuzzy logic approaches fail to follow with adequate precision the range of variation of the US 10-year Treasury bond yields. For similar reasons, the self-organising map model gives an unsatisfactory performance. Analysis of the results indicates that the econometric model has a slight edge over the statistical and the multi-layer perceptron models. This suggests that pure data-driven induction may not fully capture the complicated mechanisms governing changes in interest rates. Overall, the prediction accuracy of the best models is only marginally better than that of a basic one-step lag predictor.
This result highlights the difficulty of the modelling task and, in general, the difficulty of building reliable predictors for financial markets.
Keywords: interest rates; forecasting; neural networks; fuzzy logic.
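The one-step lag benchmark that the best models only marginally beat can be sketched in a few lines; the yield series below is illustrative, not the paper's 1986-2004 data.

```python
# Naive one-step lag benchmark: predict next month's 10-year yield to be
# this month's yield, then score with root-mean-square error (RMSE).

def lag_predictor(series):
    """Prediction for series[1:] is simply series[:-1]."""
    return series[:-1]

def rmse(pred, actual):
    return (sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(pred)) ** 0.5

yields = [4.2, 4.3, 4.1, 4.4, 4.6, 4.5]  # hypothetical end-of-month yields (%)
error = rmse(lag_predictor(yields), yields[1:])
print(round(error, 4))  # 0.1949
```

Any candidate model has to beat this error on out-of-sample data before it adds value over the trivial predictor.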

    Landslide susceptibility assessment in Karanganyar regency - Indonesia - Comparison of knowledge-based and Data-driven Models

    Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies. Disaster management requires spatial information as a backbone of the preparedness and mitigation process. In that context, an assessment of landslide susceptibility becomes essential in an area that is prone to landslides due to its geographical conditions. The Tawangmangu, Jenawi and Ngargoyoso Subdistricts in Karanganyar Regency are such areas, and are the areas most frequently hit by landslides in the Central Java Province of Indonesia. In this study, three different methods were applied to examine landslide susceptibility in that area: heuristic, statistical logistic regression and Artificial Neural Network (ANN). The heuristic method is a knowledge-based approach, whereas the latter two are categorized as data-driven methods due to the involvement of a landslide inventory in their analysis. Eight site-specific, available and commonly used landslide-influencing factors (slope, aspect, topographical shape, curvature, lithology, land use, distance to road and distance to river) were preprocessed in a GIS environment and then analyzed using statistical and GIS tools to understand the relationship and significance of each to landslide occurrence, and to generate landslide susceptibility maps. ILWIS, Idrisi and ArcGIS software were used to prepare the dataset and visualize the models, while PASW was employed to run the prediction models (logistic regression for the statistical method and a multi-layer perceptron for the ANN). The study employed degree of fit and the Receiver Operating Characteristic (ROC) to assess the models' performance. The region was mapped into five landslide susceptibility classes: very low, low, moderate, high and very high. The results also showed that lithology, land use and topographical shape are the three most influential factors (i.e., significant in controlling where landslides take place).
According to the degree of fit analysis applied to all models, the ANN performed better than the other models when predicting landslide susceptibility of the study area. Meanwhile, according to the ROC analysis applied to the data-driven methods, the ANN shows better performance (AUC 0.988) than statistical logistic regression (AUC 0.959).
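The quoted AUC values come from ROC analysis; a minimal, library-free way to compute AUC is the Mann-Whitney rank identity, as sketched below (the susceptibility scores and labels are illustrative, not the study's data):

```python
def roc_auc(scores, labels):
    """AUC via the Mann-Whitney identity: the probability that a randomly
    chosen positive (landslide) cell scores higher than a negative one,
    counting ties as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative susceptibility scores for landslide (1) and stable (0) cells
scores = [0.9, 0.8, 0.75, 0.6, 0.4, 0.3, 0.2]
labels = [1,   1,   0,    1,   0,   0,   0]
print(round(roc_auc(scores, labels), 4))  # 0.9167
```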

    Convergence and error analysis of PINNs

    Physics-informed neural networks (PINNs) are a promising approach that combines the power of neural networks with the interpretability of physical modeling. PINNs have shown good practical performance in solving partial differential equations (PDEs) and in hybrid modeling scenarios, where physical models enhance data-driven approaches. However, it is essential to establish their theoretical properties in order to fully understand their capabilities and limitations. In this study, we highlight that classical training of PINNs can suffer from systematic overfitting. This problem can be addressed by adding a ridge regularization to the empirical risk, which ensures that the resulting estimator is risk-consistent for both linear and nonlinear PDE systems. However, the strong convergence of PINNs to a solution satisfying the physical constraints requires a more involved analysis using tools from functional analysis and the calculus of variations. In particular, for linear PDE systems, an implementable Sobolev-type regularization makes it possible to reconstruct a solution that not only achieves statistical accuracy but also maintains consistency with the underlying physics.
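As a toy illustration of ridge regularization of a physics-informed empirical risk, consider the linear ODE u'(x) = -u(x) with u(0) = 1. With a polynomial ansatz (a stand-in for a neural network), the physics residual is linear in the coefficients, so the ridge-regularized risk has a closed-form minimizer. The ansatz degree, collocation grid and regularization strength are assumptions for illustration, not the paper's setup.

```python
import numpy as np

# Polynomial ansatz u(x) = sum_k theta_k x^k for u'(x) + u(x) = 0, u(0) = 1.
deg = 5
xs = np.linspace(0.0, 1.0, 20)  # collocation points for the physics residual

def residual_row(x):
    # Coefficients of theta in u'(x) + u(x): k*x^(k-1) + x^k
    dk = np.array([k * x ** (k - 1) if k > 0 else 0.0 for k in range(deg + 1)])
    uk = np.array([x ** k for k in range(deg + 1)])
    return dk + uk

# Empirical risk = physics residuals + initial-condition misfit + ridge term,
# minimized in closed form via the regularized normal equations.
A = np.vstack([residual_row(x) for x in xs] + [[1.0] + [0.0] * deg])
b = np.array([0.0] * len(xs) + [1.0])
lam = 1e-8
theta = np.linalg.solve(A.T @ A + lam * np.eye(deg + 1), A.T @ b)

u1 = theta.sum()  # u(1); the exact solution exp(-x) gives exp(-1) ~ 0.3679
print(u1)
```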

    Identifying Structure Transitions Using Machine Learning Methods

    Methodologies from data science and machine learning, both new and old, provide an exciting opportunity to investigate physical systems using extremely expressive statistical modeling techniques. Physical transitions are of particular interest, as they are accompanied by pattern changes in the configurations of the systems. Detecting and characterizing pattern changes in data happens to be a particular strength of statistical modeling in data science, especially with the highly expressive and flexible neural network models that have become increasingly computationally accessible in recent years through performance improvements in both hardware and algorithmic implementations. Conceptually, the machine learning approach can be regarded as one that employs algorithms that eschew explicit instructions in favor of strategies based around pattern extraction and inference driven by statistical analysis and large complex data sets. This allows for the investigation of physical systems using only raw configurational information to make inferences, instead of relying on physical information obtained from a priori knowledge of the system. This work focuses on the extraction of useful compressed representations of physical configurations from systems of interest to automate phase classification tasks, in addition to the identification of critical points and crossover regions.
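One common way to obtain such compressed representations from raw configurations is principal component analysis. The sketch below, a simple linear stand-in for the neural approaches discussed above, uses synthetic Ising-like spin configurations (an assumption for illustration) and shows ordered and disordered samples separating along the first principal component.

```python
import numpy as np

rng = np.random.default_rng(0)
n, L = 100, 16  # configurations per phase, spins per configuration

# Synthetic Ising-like data: ordered configs are fully aligned (all +1 or
# all -1), disordered configs are independent random spins.
ordered = np.ones((n, L)) * rng.choice([-1.0, 1.0], size=(n, 1))
disordered = rng.choice([-1.0, 1.0], size=(n, L))
X = np.vstack([ordered, disordered])

# 1-D compressed representation: projection onto the first principal component.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
z = Xc @ Vt[0]

# Ordered configurations land far from the origin, disordered ones near it,
# so a simple threshold on |z| already classifies the phase.
print(np.abs(z[:n]).mean(), np.abs(z[n:]).mean())
```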

    Drilling Performance Monitoring and Optimization: A Data-driven Approach

    Drilling performance monitoring and optimization are crucial in increasing the overall NPV of an oil and gas project. Even after rigorous planning, the drilling phase of any project can be hindered by unanticipated problems, such as bit balling. The objective of this paper is to implement an artificial intelligence technique to develop a smart model for more accurate and robust real-time drilling performance monitoring and optimization. For this purpose, a back-propagation, feed-forward neural network model was developed to predict rate of penetration (ROP) using different input parameters such as weight on bit, rotations per minute, mud flow (GPM) and differential pressures. Heavy-hitter feature identification and dimensionality reduction are performed to understand the impact of each of the drilling parameters on ROP. This is used to optimize the input parameters for model development and validation, and to perform operational optimization when the bit is underperforming. The model is first developed based on drilling experiments performed in the laboratory and then extended to field applications. From both the laboratory and field test data provided, we have shown that the data-driven model built using the multilayer perceptron technique can be successfully used for drilling performance monitoring and optimization, especially for identifying bit malfunction or failure, i.e., bit balling. We have shown that ROP has a complex relationship with other drilling variables which cannot be captured using conventional statistical approaches or different empirical models. The data-driven approach combined with statistical regression analysis provides a better understanding of the relationships between variables and the prediction of ROP.
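A minimal from-scratch sketch of the kind of feed-forward, back-propagation network described above, trained on a synthetic ROP response; the input variables, the response function and all hyperparameters are illustrative assumptions, not the paper's data or architecture:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for drilling inputs: weight on bit, RPM, mud flow,
# differential pressure (all scaled to [0, 1]).
X = rng.uniform(0.0, 1.0, size=(200, 4))
# Hypothetical nonlinear ROP response with measurement noise.
y = (5 * X[:, 0] * X[:, 1] + 2 * np.sqrt(X[:, 2]) - X[:, 3]
     + 0.1 * rng.normal(size=200))[:, None]

# One hidden layer of 8 tanh units, trained by full-batch gradient descent.
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.01

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, pred0 = forward(X)
loss0 = float(((pred0 - y) ** 2).mean())

for _ in range(3000):
    h, pred = forward(X)
    g = 2.0 * (pred - y) / len(X)        # dLoss/dPred
    gW2, gb2 = h.T @ g, g.sum(axis=0)
    gh = (g @ W2.T) * (1.0 - h ** 2)     # back-propagate through tanh
    gW1, gb1 = X.T @ gh, gh.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

_, pred = forward(X)
loss = float(((pred - y) ** 2).mean())
print(loss < loss0)  # training reduces the mean-squared error
```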

    Data-driven Soft Sensors in the Process Industry

    In the last two decades Soft Sensors have established themselves as a valuable alternative to the traditional means for the acquisition of critical process variables, process monitoring and other tasks related to process control. This paper discusses characteristics of process industry data which are critical for the development of data-driven Soft Sensors. These characteristics are common to a large number of process industry fields, like the chemical industry, bioprocess industry, steel industry, etc. The focus of this work is put on data-driven Soft Sensors because of their growing popularity, already demonstrated usefulness and huge, though not yet completely realised, potential. A comprehensive selection of case studies covering the three most important Soft Sensor application fields, a general introduction to the most popular Soft Sensor modelling techniques, as well as a discussion of some open issues in Soft Sensor development and maintenance and their possible solutions, are the main contributions of this work.
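In its simplest form, a data-driven Soft Sensor is a regression model that infers a hard-to-measure variable from readily available process measurements. The ridge-regression sketch below illustrates the idea on synthetic data; the variable names, dimensions and noise level are illustrative assumptions, not drawn from the paper's case studies.

```python
import numpy as np

rng = np.random.default_rng(42)

# Easy-to-measure process variables (e.g. temperatures, pressures, flows).
X = rng.normal(size=(300, 5))
true_w = np.array([1.5, -2.0, 0.5, 0.0, 1.0])
# Hard-to-measure target (e.g. a lab-analysed product quality), with noise.
y = X @ true_w + 0.1 * rng.normal(size=300)

# Historical data for training, newer data for validation.
Xtr, Xva, ytr, yva = X[:200], X[200:], y[:200], y[200:]

# Ridge regression: regularization guards against the collinear, noisy
# measurements typical of process industry data.
lam = 1.0
w = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(5), Xtr.T @ ytr)
err = float(np.sqrt(((Xva @ w - yva) ** 2).mean()))
print(err)  # close to the 0.1 noise floor
```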

    A generative adversarial network approach to calibration of local stochastic volatility models

    We propose a fully data-driven approach to calibrate local stochastic volatility (LSV) models, circumventing in particular the ad hoc interpolation of the volatility surface. To achieve this, we parametrize the leverage function by a family of feed-forward neural networks and learn their parameters directly from the available market option prices. This should be seen in the context of neural SDEs and (causal) generative adversarial networks: we generate volatility surfaces by specific neural SDEs, whose quality is assessed by quantifying, possibly in an adversarial manner, distances to market prices. The minimization of the calibration functional relies strongly on a variance reduction technique based on hedging and deep hedging, which is interesting in its own right: it allows the calculation of model prices and model implied volatilities in an accurate way using only small sets of sample paths. For numerical illustration we implement a SABR-type LSV model and conduct a thorough statistical performance analysis on many samples of implied volatility smiles, showing the accuracy and stability of the method.
Comment: Replacement for previous version: major update to match the content of the published version.
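Schematically, the calibration functional and the hedging-based variance reduction described above take the following form; the notation (weights w_{ij}, loss ell, hedge ratios h_t) is an assumption for exposition, not taken verbatim from the paper.

```latex
% Calibration: match model prices to market prices over strikes K_i, maturities T_j
\min_{\theta}\ \sum_{i,j} w_{ij}\,\ell\!\left(
   P^{\mathrm{mod}}_{\theta}(K_i,T_j) - P^{\mathrm{mkt}}(K_i,T_j)\right),
\qquad
% Monte Carlo model price with a hedging control variate (strategy h)
P^{\mathrm{mod}}_{\theta}(K,T) \approx
   \frac{1}{N}\sum_{n=1}^{N}\left[\bigl(S^{(n)}_{T}-K\bigr)^{+}
   - \int_{0}^{T} h^{(n)}_{t}\,\mathrm{d}S^{(n)}_{t}\right]
```

The hedging integral has zero expectation but is highly correlated with the payoff, which is why a small number of sample paths N already yields accurate price estimates.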