
    Reservoir Flooding Optimization by Control Polynomial Approximations

    In this dissertation, we provide novel parameterization procedures for water-flooding production optimization problems, using polynomial approximation techniques. The methods project the original infinite-dimensional control space onto a polynomial subspace. Our contribution includes new parameterization formulations using natural polynomials, orthogonal Chebyshev polynomials, and cubic spline interpolation. We show that the proposed methods are well suited for a black-box approach with stochastic global-search methods, as they tend to produce smooth control trajectories while reducing the size of the solution space. We demonstrate their efficiency on synthetic two-dimensional problems and on a realistic three-dimensional problem. We also contribute a new adjoint method formulation for polynomial approximation, which allowed us to implement the methods with gradient-based algorithms as well. In addition to fine-scale simulation, we performed reduced-order modeling, where we demonstrated a synergistic effect when combining polynomial approximation with model order reduction, leading to faster optimization with higher gains in terms of Net Present Value. Finally, we performed gradient-based optimization under uncertainty. We propose a new multi-objective function with three components: one that maximizes the expected value over all realizations, and two that maximize the averages of the distribution tails on both sides. The new objective gives decision makers the flexibility to choose the amount of risk they are willing to take when deciding on a production strategy or performing reserves estimation (P10; P50; P90).
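The core idea, parameterizing a control trajectory by a few polynomial coefficients, can be sketched as follows. This is a minimal illustration using Chebyshev polynomials, not the dissertation's implementation; the coefficient values, step count, and rate bounds are hypothetical.

```python
import numpy as np

def chebyshev_controls(coeffs, n_steps):
    """Expand a few Chebyshev coefficients into a full control trajectory.

    The optimizer searches over `coeffs` (a low-dimensional space)
    instead of over one rate value per control step.
    """
    # Map the control steps onto the Chebyshev domain [-1, 1]
    t = np.linspace(-1.0, 1.0, n_steps)
    # Evaluate sum_k c_k * T_k(t)
    return np.polynomial.chebyshev.chebval(t, coeffs)

# Hypothetical example: 4 coefficients stand in for 60 per-step rate values
rates = chebyshev_controls([0.5, 0.1, -0.05, 0.02], n_steps=60)
rates = np.clip(rates, 0.0, 1.0)  # enforce simple bound constraints
```

A global-search method then optimizes only the four coefficients, and the resulting trajectories are smooth by construction, which matches the behavior the abstract attributes to these parameterizations.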

    Phase-field modeling of brittle fracture with multi-level hp-FEM and the finite cell method

    The difficulties in dealing with the discontinuities of a sharp crack are overcome in the phase-field approach to fracture by modeling the crack as a diffusive object described by a continuous field with high gradients. The discrete-crack limit is approached for a small length-scale parameter that controls the width of the transition region between the fully broken and undamaged phases. From a computational standpoint, this necessitates fine meshes, at least locally, in order to accurately resolve the phase-field profile. In the classical approach, phase-field models are computed on a fixed mesh that is refined a priori in the areas where the crack is expected to propagate. This, however, curbs the convenience of using phase-field models for unknown crack paths and their ability to handle complex crack propagation patterns. In this work, we overcome this issue by employing the multi-level hp-refinement technique, which enables a dynamically changing mesh and thus allows the refinement to remain local at singularities and high gradients without the problem of hanging nodes. Yet, in the case of complex geometries, mesh generation, and local refinement in particular, becomes non-trivial. We address this issue by integrating a two-dimensional phase-field framework for brittle fracture with the finite cell method (FCM). The FCM, based on high-order finite elements, is a non-geometry-conforming discretization technique wherein the physical domain is embedded into a larger fictitious domain of simple geometry that can be easily discretized. This facilitates mesh generation for complex geometries and supports local refinement. Numerical examples, including a comparison to a validation experiment, illustrate the applicability of multi-level hp-refinement and the FCM in the context of phase-field simulations.
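The mesh-resolution argument can be made concrete with the standard one-dimensional phase-field profile d(x) = exp(-|x|/l), a textbook result for the second-order model rather than something specific to this paper: the transition region shrinks linearly with the length-scale parameter l, so the local element size must shrink with it.

```python
import numpy as np

def phase_field_profile(x, length_scale):
    # Closed-form 1D profile of the second-order phase-field model:
    # d = 1 on the crack, decaying to 0 over a band of width ~ length_scale
    return np.exp(-np.abs(x) / length_scale)

x = np.linspace(-1.0, 1.0, 2001)
widths = []
for l in (0.1, 0.01):
    d = phase_field_profile(x, l)
    # Width of the region where d > 0.01; analytically ~ 2 * l * ln(100)
    widths.append(np.ptp(x[d > 0.01]))
```

Shrinking l by a factor of ten shrinks the transition band by the same factor, which is why a uniformly fine fixed mesh is wasteful and dynamic local hp-refinement pays off.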

    Improved behavioral analysis of fuzzy cognitive map models

    Fuzzy Cognitive Maps (FCMs) are widely applied for describing the major components of complex systems and their interconnections. The popularity of FCMs rests mostly on their simple system representation, easy model creation and usage, and their decision support capabilities. The preferable way of model construction is based on historical, measured data of the investigated system and a suitable learning technique. Such data are not always available, however. In these cases, experts have to define the strength and direction of the causal connections among the components of the system, and their decisions are unavoidably affected by more or less subjective elements. Unfortunately, even a small change in an estimated strength may lead to a significantly different simulation outcome, which could pose significant decision risks. Therefore, the preliminary exploration of a model's sensitivity to subtle weight modifications is very important to decision makers, as it can draw their attention to possible problems. This paper deals with an advanced version of a behavioral analysis. Based on the experiences of the authors, their method is further improved to generate more life-like, slightly modified model versions based on the original one suggested by experts. The details of the method are described, and its application and results are presented through a banking example. The combination of Pareto fronts and the Bacterial Evolutionary Algorithm is a novelty of the approach. © Springer International Publishing AG, part of Springer Nature 2018. Peer reviewed.
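The sensitivity the paper probes comes from the FCM's iterated state update. The sketch below uses the common sigmoid-threshold update rule; the three-concept weight matrix and the perturbation are hypothetical, not the banking model from the paper.

```python
import numpy as np

def fcm_step(state, W, lam=1.0):
    # One synchronous FCM update: each concept aggregates the weighted
    # influences of all concepts, squashed by a sigmoid into (0, 1)
    return 1.0 / (1.0 + np.exp(-lam * (W @ state)))

def simulate(state, W, steps=50, tol=1e-6):
    # Iterate until the state settles (or the step budget runs out)
    for _ in range(steps):
        new = fcm_step(state, W)
        if np.max(np.abs(new - state)) < tol:
            break
        state = new
    return state

# Hypothetical 3-concept map
W = np.array([[0.0,  0.6, -0.4],
              [0.3,  0.0,  0.5],
              [0.0, -0.7,  0.0]])
x = simulate(np.array([0.5, 0.5, 0.5]), W)

# A small expert-estimation error on a single weight gives a slightly
# different model whose fixed point can be compared against the original
W_perturbed = W.copy()
W_perturbed[0, 1] += 0.05
x_perturbed = simulate(np.array([0.5, 0.5, 0.5]), W_perturbed)
```

Generating many such slightly modified weight matrices and comparing their simulation outcomes is the essence of the behavioral analysis the paper improves.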

    Simulations and Modelling for Biological Invasions

    Biological invasions are characterized by the movement of organisms from their native geographic region to new, distinct regions in which they may have significant impacts. Biological invasions pose one of the most serious threats to global biodiversity, and hence significant resources are invested in predicting, preventing, and managing them. Biological systems and processes are typically large, complex, and inherently difficult to study in nature because of their immense scale; hence, computational modelling and simulation approaches can be taken to study them. In this dissertation, I applied computer simulations to address two important problems in invasion biology. First, the impact of the genetic diversity of introduced populations on their establishment success is unknown. We took an individual-based modelling approach to explore this, leveraging an ecosystem simulation called EcoSim to simulate biological invasions. We conducted reciprocal transplants of prey individuals across two simulated environments, over a gradient of genetic diversity. Our simulation results demonstrated that a harsh environment with low and spatially varying resource abundance, rather than the degree of difference between native and introduced ranges, mediated the relationship between genetic diversity and the short-term establishment success of introduced populations. We also found that reducing Allee effects by maintaining compactness, a measure of spatial density, was key to the establishment success of prey individuals in EcoSim, which were sexually reproducing. Further, we found evidence of a more complex relationship between genetic diversity and long-term establishment success, assuming multiple introductions were occurring: low-diversity populations seemed to benefit more strongly from multiple introductions than high-diversity populations.
    Our results also corroborated the evolutionary imbalance hypothesis: the environment that yielded greater diversity produced better invaders and was itself less invasible. Our study further corroborated a mechanistic explanation for the evolutionary imbalance hypothesis: populations that evolved in a more intense competitive environment produced better invaders. Second, an important advancement in invasion biology is the use of genetic barcoding or metabarcoding, in conjunction with next-generation sequencing, as a potential means of early detection of aquatic introduced species. Barcoding and metabarcoding invariably require some amount of computational DNA sequence processing. Unfortunately, optimal processing parameters are not known in advance, and the consequences of suboptimal parameter selection are poorly understood. We aimed to determine the optimal parameterization of a common sequence processing pipeline both for the early detection of aquatic nonindigenous species and for conducting species richness assessments. We then aimed to determine the performance of the optimized pipelines in a simulated inoculation of sequences into community samples. We found that early detection requires relatively lenient processing parameters. Further, optimality depended on the research goal: what was optimal for early detection was suboptimal for estimating species richness, and vice versa. Finally, with optimal parameter selection, fewer than 11 target sequences were required to detect 90% of nonindigenous species.

    XVoxel-Based Parametric Design Optimization of Feature Models

    Parametric optimization is an important product design technique, especially in the context of the modern parametric, feature-based CAD paradigm. Realizing its full potential, however, requires a closed loop between CAD and CAE (i.e., CAD/CAE integration) with automatic design modifications and simulation updates. Conventionally, model conversion is employed to form this loop, but this way of working is hard to automate and requires manual input; as a result, the overall optimization process is too laborious to be acceptable. To address this issue, this paper introduces a new method for parametric optimization based on a unified model representation scheme called eXtended Voxels (XVoxels). This scheme hybridizes feature models and voxel models into a new concept of semantic voxels, where the voxel part is responsible for FEM solving and the semantic part is responsible for the high-level information that captures both design and simulation intent. As such, it can establish a direct mapping between design models and analysis models, which in turn enables automatic updates of simulation results upon design modifications, and vice versa: effectively a closed loop between CAD and CAE. In addition, robust and efficient geometric algorithms for manipulating XVoxel models and efficient numerical methods (based on the recent finite cell method) for simulating XVoxel models are provided. The presented method has been validated by a series of case studies of increasing complexity to demonstrate its effectiveness. In particular, a computational efficiency improvement of up to 55.8 times over the existing FCM method has been observed.
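The fictitious-domain idea underlying the finite cell method part of this scheme can be sketched with a plain voxel indicator field. This is an illustrative sketch only: the geometry, grid size, and penalty value are hypothetical, and the paper's XVoxels additionally attach semantic feature information to each voxel.

```python
import numpy as np

def voxelize(inside, bounds, n):
    """Embed a physical domain in a regular fictitious-domain grid.

    inside(x, y) : predicate that is True on the physical domain
    Returns a per-voxel indicator alpha: 1.0 inside the domain, a small
    penalty value outside (the finite cell method's soft indicator).
    """
    (x0, x1), (y0, y1) = bounds
    xs = np.linspace(x0, x1, n)
    ys = np.linspace(y0, y1, n)
    X, Y = np.meshgrid(xs, ys, indexing="ij")
    return np.where(inside(X, Y), 1.0, 1e-8)

# Hypothetical feature model: a plate with a circular hole of radius 0.5
alpha = voxelize(lambda x, y: x**2 + y**2 > 0.25,
                 bounds=((-1, 1), (-1, 1)), n=64)
```

Because the grid never has to conform to the feature geometry, a design modification only requires re-evaluating the indicator field, which is what makes automatic CAD-to-CAE updates cheap in this style of method.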

    Causal Discovery from Temporal Data: An Overview and New Perspectives

    Temporal data, representing chronological observations of complex systems, is a typical data structure widely generated in many domains, such as industry, medicine, and finance. Analyzing this type of data is extremely valuable for various applications, and different temporal data analysis tasks, e.g., classification, clustering, and prediction, have been proposed in the past decades. Among them, causal discovery, learning the causal relations from temporal data, is considered an interesting yet critical task and has attracted much research attention. Existing causal discovery works can be divided into two highly correlated categories according to whether the temporal data is calibrated, i.e., multivariate time series causal discovery and event sequence causal discovery. However, most previous surveys focus only on time series causal discovery and ignore the second category. In this paper, we specify the correlation between the two categories and provide a systematic overview of existing solutions. Furthermore, we provide public datasets, evaluation metrics, and new perspectives for temporal data causal discovery.
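For the time series category, a classical approach in this literature is Granger-style causal discovery: X is a candidate cause of Y if X's history improves the prediction of Y beyond Y's own history. A minimal pairwise sketch follows; the variable names, lag choice, and synthetic data are illustrative, not taken from the survey.

```python
import numpy as np

def granger_score(x, y, lag=2):
    """Pairwise Granger-style score: fractional reduction in residual
    variance of y when x's lags are added to y's own lags.
    A clearly positive score suggests x -> y."""
    n = len(y) - lag
    Y = y[lag:]
    # Lagged predictors: y's own history, then y's and x's history together
    own = np.column_stack([y[lag - k - 1:n + lag - k - 1] for k in range(lag)])
    both = np.column_stack(
        [own] + [x[lag - k - 1:n + lag - k - 1] for k in range(lag)])
    def rss(A):
        A = np.column_stack([np.ones(n), A])  # intercept term
        beta, *_ = np.linalg.lstsq(A, Y, rcond=None)
        resid = Y - A @ beta
        return resid @ resid
    return (rss(own) - rss(both)) / rss(own)

# Synthetic series in which x drives y with a one-step delay
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()

score_xy = granger_score(x, y)  # large: x's history predicts y
score_yx = granger_score(y, x)  # near zero: y's history does not predict x
```

Full causal discovery methods extend this pairwise test to whole variable sets with conditioning, and the event-sequence category replaces the regression with point-process models, but the asymmetry illustrated here is the basic signal being exploited.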