    Advanced Numerical Modeling in Manufacturing Processes

    In manufacturing applications, a large amount of data can be collected through experimental studies and/or sensors. This data is vital for improving process efficiency, scheduling maintenance activities, and predicting target variables. This dissertation explores a wide range of numerical modeling techniques that use data for manufacturing applications. Existing methods suffer from two shortcomings: they ignore uncertainty and the physical principles governing a system. Several methods are therefore proposed to overcome these shortcomings by incorporating uncertainty and physics-based knowledge. In the first part of this dissertation, artificial neural networks (ANNs) are applied to develop a functional relationship between input and target variables and to optimize process parameters. The second part evaluates robust response surface optimization (RRSO) to quantify different sources of uncertainty in numerical analysis. Additionally, a framework based on the Bayesian network (BN) approach is proposed to support decision-making. Because of these various uncertainties, interval and probability-distribution estimates are often more helpful than deterministic point estimates. Thus, the third part of this dissertation explores a Monte Carlo (MC) dropout-based interval prediction technique; a conservative interval prediction technique for linear and polynomial regression models is also developed using linear optimization. Data-driven methods are useful in manufacturing for analyzing situations, gaining insights, and making essential decisions, but their predictions may be physically inconsistent. Thus, in the fourth part of this dissertation, a physics-informed machine learning (PIML) technique is proposed that incorporates physics-based knowledge alongside collected data to improve prediction accuracy and generate physically consistent outcomes. Each numerical analysis section is presented with case studies involving conventional or additive manufacturing applications. Based on the case studies carried out, it can be concluded that advanced numerical modeling methods are essential in manufacturing applications for gaining advantages in the era of Industry 4.0 and Industry 5.0. Although the case studies in this dissertation are presented only for manufacturing-related applications, the methods are not limited to manufacturing and can also be extended to other data-driven engineering and system applications.
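    The MC dropout technique mentioned in the third part can be sketched compactly: dropout is kept active at inference time, repeated stochastic forward passes yield an empirical predictive distribution, and interval bounds are read off as quantiles. The following is a minimal sketch in PyTorch, not the dissertation's actual model; the network architecture, dropout rate, sample count, and input data are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Small regression net with dropout layers (illustrative architecture)."""
    def __init__(self, d_in=4, d_hidden=64, p=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_in, d_hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(d_hidden, d_hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(d_hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

def mc_dropout_interval(model, x, n_samples=200, alpha=0.05):
    """Keep dropout active at inference and pool stochastic forward passes."""
    model.train()  # train mode keeps dropout on (freeze batch-norm if present)
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])  # (T, N, 1)
    lower = preds.quantile(alpha / 2, dim=0)       # e.g. 2.5th percentile
    upper = preds.quantile(1 - alpha / 2, dim=0)   # e.g. 97.5th percentile
    return preds.mean(dim=0), lower, upper

model = MLP()                  # untrained here; fit on real data first
x = torch.randn(8, 4)          # hypothetical input batch
mean, lo, hi = mc_dropout_interval(model, x)
print(mean.squeeze(), lo.squeeze(), hi.squeeze(), sep="\n")
```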

    A combined model of statistical downscaling and latent process multivariable spatial modeling of precipitation extremes

    Future projections of extreme precipitation can help engineers and scientists with infrastructure design projects and risk assessment studies. Extreme events are usually represented as return levels, which are equivalent to upper percentiles of an extreme value distribution, such as the Generalized Pareto distribution used for exceedances above a chosen threshold. My dissertation focuses on uncertainty quantification in the estimation of future return levels for precipitation at the local (weather station) to regional level. Variance reduction is achieved through spatial modeling and by optimally combining suites of climate model outputs. The main contribution is a unified statistical model that combines the variance reduction methods with a latent-model statistical downscaling technique. The dissertation is presented in three chapters: (I) Single-Location Bayesian Estimation of the Generalized Pareto Distribution (GPD); (II) Multiple-Location Bayesian Estimation of the GPD with a Spatial Latent Process; and (III) Spatial Combining of Multiple Climate Model Outputs and Downscaling for Projections of Future Extreme Precipitation.
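    As a point of reference for the threshold-exceedance setup, the sketch below fits a GPD to exceedances above a high threshold and converts the fit into a return level. It uses plain maximum likelihood via scipy as a stand-in for the dissertation's Bayesian (and spatial) machinery; the synthetic data, threshold quantile, and return period are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
precip = rng.gamma(shape=2.0, scale=5.0, size=20_000)  # synthetic "daily precipitation"

u = np.quantile(precip, 0.95)        # high threshold
exc = precip[precip > u] - u         # exceedances above the threshold
xi, _, sigma = stats.genpareto.fit(exc, floc=0)  # GPD shape xi, scale sigma (MLE)

# Standard GPD return level (Coles, 2001): the level exceeded once every
# m observations is x_m = u + (sigma/xi) * ((m * zeta_u)^xi - 1) for xi != 0,
# where zeta_u = P(X > u); as xi -> 0 it tends to u + sigma * log(m * zeta_u).
zeta_u = exc.size / precip.size
m = 100 * 365                        # ~100-year return period in daily observations
x_m = u + (sigma / xi) * ((m * zeta_u) ** xi - 1)
print(f"u={u:.2f}  xi={xi:.3f}  sigma={sigma:.3f}  100-yr level={x_m:.2f}")
```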

    Efficient Prediction Designs for Random Fields

    For estimation and prediction of random fields, it is increasingly acknowledged that the kriging variance may be a poor representation of the true uncertainty. Experimental designs based on more elaborate criteria that are appropriate for empirical kriging are then often non-space-filling and very costly to determine. In this paper, we investigate the possibility of using a compound criterion, inspired by an equivalence-theorem-type relation, to build designs that are quasi-optimal for the empirical kriging variance when space-filling designs become unsuitable. Two algorithms are proposed: the first relies on stochastic optimization to explicitly identify the Pareto front, while the second uses the surrogate criterion as a local heuristic to choose the points at which the (costly) true empirical kriging variance is actually computed. We illustrate the performance of the proposed algorithms on both a simple simulated example and a real oceanographic dataset.
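    For orientation, the sketch below evaluates the plug-in (known-parameter) kriging variance over a prediction grid for a small one-dimensional design; the empirical kriging variance targeted by the paper augments this with a correction for the estimated covariance parameters, which is omitted here. The Gaussian covariance, design sites, and grid are illustrative assumptions.

```python
import numpy as np

def kernel(a, b, sigma2=1.0, ell=0.2):
    """Gaussian (squared-exponential) covariance between 1-D site vectors."""
    d = np.subtract.outer(a, b)
    return sigma2 * np.exp(-0.5 * (d / ell) ** 2)

X = np.array([0.05, 0.35, 0.60, 0.90])   # design sites on [0, 1]
xs = np.linspace(0.0, 1.0, 201)          # prediction grid

K = kernel(X, X) + 1e-10 * np.eye(X.size)   # jitter for numerical stability
k = kernel(xs, X)                           # cross-covariances, shape (201, 4)

# Plug-in (simple) kriging variance: sigma^2(x) = k(x, x) - k(x)^T K^{-1} k(x).
alpha = np.linalg.solve(K, k.T)             # K^{-1} k(x) for every grid point
var = kernel(xs, xs).diagonal() - np.einsum("ij,ji->i", k, alpha)

# A design criterion would now score the design, e.g. by its worst-case variance.
print("max plug-in kriging variance over the grid:", var.max())
```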

    Novel sampling techniques for reservoir history matching optimisation and uncertainty quantification in flow prediction

    Modern reservoir management has an increasing focus on accurately predicting the likely range of field recoveries. A variety of assisted history matching techniques have been developed across the research community concerned with this topic. These techniques are based on obtaining multiple models that closely reproduce the historical flow behaviour of a reservoir. The resulting set of history-matched models is then used to quantify the uncertainty in predictions of the reservoir's future performance and to provide economic evaluations for different field development strategies. The key step in this workflow is to employ algorithms that sample the parameter space in an efficient but appropriate manner. The choice of algorithm affects how quickly a model is obtained and how well the model fits the production data. The sampling techniques developed to date include, among others, gradient-based methods, evolutionary algorithms, and the ensemble Kalman filter (EnKF). This thesis has investigated and further developed the following sampling and inference techniques: Particle Swarm Optimisation (PSO), Hamiltonian Monte Carlo, and Population Markov Chain Monte Carlo. These techniques can navigate the parameter space and produce history-matched models for quantifying forecast uncertainty in a faster and more reliable way. Their analysis, compared with the Neighbourhood Algorithm (NA), has shown how the different techniques affect the predicted recovery from petroleum systems and the benefits of the developed methods over the NA. The history matching problem is multi-objective in nature: the production data may consist of multiple types, come from different wells, and be collected at different times. Multiple objectives can be constructed from these data and explicitly optimised in a multi-objective scheme. The thesis has extended PSO to handle multi-objective history matching problems in which a number of possibly conflicting objectives must be satisfied simultaneously. The benefits and efficiency of the innovative multi-objective particle swarm scheme (MOPSO) are demonstrated on synthetic reservoirs. It is shown that the MOPSO procedure can provide a substantial improvement in finding a diverse set of good-fitting models with fewer of the very costly forward simulation runs than the standard single-objective case, depending on how the objectives are constructed. The thesis has also shown how to tackle a large number of unknown parameters by coupling high-performance global optimisation algorithms, such as PSO, with model reduction techniques such as kernel principal component analysis (PCA) for parameterising spatially correlated random fields. The results of the PSO-PCA coupling applied to a recent SPE benchmark history matching problem demonstrate that the approach is indeed applicable to practical problems. Finally, a comparison of PSO with the EnKF data assimilation method concluded that both methods obtain comparable results on the example case, which reinforces the need for using a range of assisted history matching algorithms for greater confidence in predictions.
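    To fix ideas, the sketch below implements a plain single-objective PSO loop minimising a stand-in misfit function; in an actual history matching workflow the objective would wrap a (costly) reservoir simulation, and the swarm size, inertia weight, and acceleration coefficients shown are common textbook defaults, not the thesis's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

def misfit(theta):
    """Hypothetical stand-in for a production-data misfit; a simulator goes here."""
    return np.sum((theta - 0.3) ** 2, axis=-1)

n_particles, dim, iters = 30, 5, 100
w, c1, c2 = 0.72, 1.49, 1.49           # inertia and acceleration coefficients

x = rng.uniform(0.0, 1.0, (n_particles, dim))   # positions in the unit hypercube
v = np.zeros_like(x)                            # velocities
pbest, pbest_f = x.copy(), misfit(x)            # personal bests
gbest = pbest[pbest_f.argmin()]                 # global best

for _ in range(iters):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, 0.0, 1.0)                # keep particles inside bounds
    f = misfit(x)
    better = f < pbest_f                        # update personal and global bests
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[pbest_f.argmin()]

print("best misfit:", pbest_f.min(), "at", gbest.round(3))
```

    A multi-objective variant such as the thesis's MOPSO would instead maintain an archive of non-dominated solutions and draw its guide particles from that archive, rather than using a single global best.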