84 research outputs found
Fracture formation due to differential compaction under glacial load: a poro-elastoplastic simulation of the Hugin Fracture
The Hugin Fracture, discovered in 2011, is an approximately 3.5 km long seafloor fracture in the North Sea. The fracture was unexpected and, given the geology of the North Sea, no obvious explanation could be found. In our study, we adopt the hypothesis that the Hugin Fracture was formed by differential compaction controlled by glacial load. We construct a simplified 2D geomechanical model partly covered by a top load (ice sheet) and test this hypothesis using transient poro-elastoplastic simulation with a finite element method. Because the fracture is located in between well locations, we had to make assumptions regarding the material properties: we used descriptions from drilling site survey reports and literature values and performed seismic matching from well paths to the Hugin Fracture. Nearby well data were only partly useful due to incomplete logging in the first 400 m below the seafloor. To overcome this problem, we introduced a mixing k-value that allows us to change the material properties continuously from pure clay to sand. Varying the mixing k-value between simulations provided information about the limits and robustness of the simulation results. The simulations show an isotropic stress and strain distribution in the horizontally layered, isotropic part of the model that is totally covered by the ice. In the central, channelized part of the model, a composite stress and strain pattern develops, with sub-vertical focus areas tangential to channel edges. Low stress, strain and deformation values under total load increase drastically soon after the load starts to decrease, resulting in the development of fractures along the focussed zones. Surface deformation is observed, such as the formation of compaction ridges above stiff clay-filled channels and depressions associated with plastic deformation.
A fracture and associated surface deformation develop above the shallowest sand-filled channel, closely resembling the observed geometry at the Hugin Fracture. The simulation supports the hypothesis that the Hugin Fracture formed as a compaction fracture and suggests that thin ice sheets may induce differential compaction to a depth of several hundred meters.
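The mixing k-value in the abstract blends material properties between clay and sand end members. A minimal sketch of such a linear blend (the function name and all property values are hypothetical, not taken from the study):

```python
def mix_properties(k, clay, sand):
    """Linearly blend material properties between pure clay (k = 0) and pure sand (k = 1)."""
    if not 0.0 <= k <= 1.0:
        raise ValueError("mixing k-value must lie in [0, 1]")
    return {name: (1.0 - k) * clay[name] + k * sand[name] for name in clay}

# Hypothetical end-member values (Young's modulus in Pa, Poisson ratio dimensionless).
clay = {"youngs_modulus": 20e6, "poisson_ratio": 0.40}
sand = {"youngs_modulus": 80e6, "poisson_ratio": 0.25}

halfway = mix_properties(0.5, clay, sand)  # equal parts clay and sand
```

Sweeping k over [0, 1] and re-running the simulation for each value is what the abstract describes as probing the limits and robustness of the results.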
FEniCS implementation of the Virtual Fields Method (VFM) for nonhomogeneous hyperelastic identification
It is of great significance to identify the nonhomogeneous distribution of
material properties in human tissues for different clinical and medical
applications. This leads to the requirement of solving an inverse problem in
elasticity. The virtual fields method (VFM) is a rather recent inverse method
with remarkable computational efficiency compared with the optimization-based
methods. In this study, we aim to identify nonhomogeneous hyperelastic material
properties using the VFM. We propose two novel algorithms, RE-VFM and NO-VFM.
In RE-VFM, the solid is partitioned into different regions and the elastic
properties of each region are determined. In NO-VFM, the distribution of
elastic properties is completely reconstructed through the inverse problem
without partitioning the solid. As the VFM requires the use of virtual fields, we
proposed an efficient way to construct them and implemented the approach in the
FEniCS package. We validated the proposed methods on several examples,
including a bilayer structure, a lamina cribrosa (LC) model and a cube model
embedded with a spherical inclusion. The numerical examples illustrate the
feasibility of both RE-VFM and NO-VFM. Notably, the spatial variations of the
Young's modulus distribution can be recovered accurately within only 5
iterations. The obtained results reveal the potential of the proposed methods
for future clinical applications such as estimating the risk of vision loss
related to glaucoma and detecting tumors. Comment: Advances in Software Engineering, in press.
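The core of the VFM is balancing internal and external virtual work for a chosen virtual field. A minimal 1D sketch, assuming a linear-elastic bar with the virtual field u*(x) = x (the nonhomogeneous hyperelastic case treated in the paper is substantially more involved, and all numbers here are illustrative):

```python
import numpy as np

def identify_modulus_vfm(x, strain, force, area):
    """Identify Young's modulus of an axially loaded bar via the VFM.

    The virtual field u*(x) = x gives virtual strain 1, so the principle of
    virtual work reads  E * area * integral(strain dx) = force * u*(L).
    """
    # Trapezoidal integral of the measured strain field.
    integral = float(np.sum((strain[:-1] + strain[1:]) * np.diff(x)) / 2.0)
    internal = area * integral          # internal virtual work, with E factored out
    external = force * x[-1]            # external virtual work: F * u*(L)
    return external / internal

# Synthetic data: uniform strain in a bar of known geometry (hypothetical numbers).
x = np.linspace(0.0, 1.0, 101)                 # 1 m bar
E_true, area = 70e9, 1e-4                      # Pa, m^2
force = E_true * area * 1e-3                   # chosen so the strain is 1e-3
strain = np.full_like(x, force / (E_true * area))

E_est = identify_modulus_vfm(x, strain, force, area)
```

No optimization loop is needed in this linear toy case, which is the computational advantage the abstract attributes to the VFM over optimization-based inverse methods.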
A structure-preserving integrator for incompressible finite elastodynamics based on a grad-div stabilized mixed formulation with particular emphasis on stretch-based material models
We present a structure-preserving scheme based on a recently-proposed mixed
formulation for incompressible hyperelasticity formulated in principal
stretches. Although there exist Hamiltonians introduced for
quasi-incompressible elastodynamics based on different variational
formulations, the one for the fully incompressible regime has yet to be
identified in the literature. The adopted mixed formulation naturally provides
a new Hamiltonian for fully incompressible elastodynamics. Invoking the
discrete gradient formula, we are able to design fully-discrete schemes that
preserve the Hamiltonian and momenta. The scaled mid-point formula, another
popular option for constructing algorithmic stresses, is analyzed and
demonstrated to be non-robust numerically. The generalized Taylor-Hood element
based on the spline technology conveniently provides a higher-order, robust,
and inf-sup stable spatial discretization option for finite strain analysis. To
enhance the element performance in volume conservation, the grad-div
stabilization, a technique initially developed in computational fluid dynamics,
is introduced here for elastodynamics. It is shown that the stabilization term
does not impose additional restrictions for the algorithmic stress to respect
the invariants, leading to an energy-decaying and momentum-conserving fully
discrete scheme. A set of numerical examples is provided to justify the claimed
properties. The grad-div stabilization is found to enhance the discrete mass
conservation effectively. Furthermore, in contrast to conventional algorithms
based on Cardano's formula and perturbation techniques, the spectral
decomposition algorithm developed by Scherzinger and Dohrmann is robust and
accurate to ensure the discrete conservation laws and is thus recommended for
stretch-based material modeling.
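The discrete gradient formula mentioned above is what yields exact energy conservation in the fully discrete scheme. A minimal sketch on a one-degree-of-freedom Hamiltonian H(q, p) = p²/2 + V(q), a stand-in for the incompressible elastodynamics setting rather than the paper's actual scheme:

```python
def discrete_gradient_step(q, p, dt, V, dV, tol=1e-14, max_iter=100):
    """One step of a discrete-gradient scheme for H(q, p) = p**2/2 + V(q).

    The quotient (V(q1) - V(q0)) / (q1 - q0) replaces V'(q); when the
    implicit equations are satisfied, H is conserved exactly.
    """
    q1, p1 = q, p
    for _ in range(max_iter):                    # fixed-point iteration
        if abs(q1 - q) < 1e-14:
            grad = dV(0.5 * (q + q1))            # limit of the quotient
        else:
            grad = (V(q1) - V(q)) / (q1 - q)     # discrete gradient
        p1_new = p - dt * grad
        q1_new = q + dt * 0.5 * (p + p1_new)
        converged = abs(q1_new - q1) < tol and abs(p1_new - p1) < tol
        q1, p1 = q1_new, p1_new
        if converged:
            break
    return q1, p1

V = lambda q: 0.25 * q ** 4        # quartic potential: a stand-in nonlinearity
dV = lambda q: q ** 3

q, p = 1.0, 0.0
H0 = 0.5 * p ** 2 + V(q)
for _ in range(1000):
    q, p = discrete_gradient_step(q, p, 0.1, V, dV)
H1 = 0.5 * p ** 2 + V(q)           # equals H0 up to solver tolerance
```

A standard explicit integrator would drift in energy over these 1000 steps; the discrete-gradient update conserves H to the tolerance of the nonlinear solve.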
Software for Exascale Computing - SPPEXA 2016-2019
This open access book summarizes the research done and results obtained in the second funding phase of the Priority Program 1648 "Software for Exascale Computing" (SPPEXA) of the German Research Foundation (DFG), presented at the SPPEXA Symposium in Dresden during October 21-23, 2019. In that respect, it both represents a continuation of Vol. 113 in Springer’s series Lecture Notes in Computational Science and Engineering, the corresponding report of SPPEXA’s first funding phase, and provides an overview of SPPEXA’s contributions towards exascale computing in today's supercomputer technology. The individual chapters address one or more of the research directions (1) computational algorithms, (2) system software, (3) application software, (4) data management and exploration, (5) programming, and (6) software tools. The book has an interdisciplinary appeal: scholars from computational sub-fields in computer science, mathematics, physics, or engineering will find it of particular interest.
Plasma propulsion simulation using particles
This perspective paper deals with an overview of particle-in-cell / Monte
Carlo collision models applied to different plasma-propulsion configurations
and scenarios, from electrostatic (E x B and pulsed arc) devices to
electromagnetic (RF inductive, helicon, electron cyclotron resonance)
thrusters, with an emphasis on plasma plumes and their interaction with the
satellite. The most important elements related to the modeling of plasma-wall
interaction are also presented. Finally, the paper reports new progress in the
particle-in-cell computational methodology, in particular regarding
accelerating computational techniques for multi-dimensional simulations and
plasma chemistry Monte Carlo modules for molecular and alternative propellants.
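The first stage of any particle-in-cell cycle is depositing particle charge onto the grid, and its counterpart is gathering grid fields back at particle positions. A minimal 1D cloud-in-cell sketch (function names and normalization are illustrative only, not from the paper):

```python
import numpy as np

def deposit_cic(positions, charges, n_cells, length):
    """Cloud-in-cell charge deposition onto a periodic 1D grid.

    Each particle's charge is shared linearly between its two nearest
    grid nodes, producing a charge density rho on the grid.
    """
    dx = length / n_cells
    rho = np.zeros(n_cells)
    xi = positions / dx
    left = np.floor(xi).astype(int) % n_cells       # index of the node to the left
    frac = xi - np.floor(xi)                        # fractional distance to it
    np.add.at(rho, left, charges * (1.0 - frac) / dx)
    np.add.at(rho, (left + 1) % n_cells, charges * frac / dx)
    return rho

def gather_cic(field, positions, length):
    """Interpolate a grid field back to particle positions (same CIC weights)."""
    n_cells = field.size
    dx = length / n_cells
    xi = positions / dx
    left = np.floor(xi).astype(int) % n_cells
    frac = xi - np.floor(xi)
    return field[left] * (1.0 - frac) + field[(left + 1) % n_cells] * frac

# Charge conservation check: the grid integral of rho equals the total particle charge.
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 1.0, 1000)
q = np.full(1000, -1.0)
rho = deposit_cic(pos, q, 64, 1.0)
total = rho.sum() * (1.0 / 64)                      # total deposited charge
```

Using the same weights for deposition and gathering is what keeps the scheme free of self-forces, a standard requirement in PIC methodology.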
A new unified arc-length method for damage mechanics problems
The numerical solution of continuum damage mechanics (CDM) problems suffers
from convergence-related challenges during the material softening stage, and
consequently existing iterative solvers are subject to a trade-off between
computational expense and solution accuracy. In this work, we present a novel
unified arc-length (UAL) method, and we derive the formulation of the
analytical tangent matrix and governing system of equations for both local and
non-local gradient damage problems. Unlike existing versions of arc-length
solvers that monolithically scale the external force vector, the proposed
method treats the latter as an independent variable and determines the position
of the system on the equilibrium path based on all the nodal variations of the
external force vector. This approach renders the proposed solver substantially
more efficient and robust than existing solvers used in CDM problems. We
demonstrate the considerable advantages of the proposed algorithm through
several benchmark 1D problems with sharp snap-backs and 2D examples under
various boundary conditions and loading scenarios. The proposed UAL approach
exhibits a superior ability to overcome critical increments along the
equilibrium path. Moreover, the proposed UAL method is 1-2 orders of magnitude
faster than force-controlled arc-length and monolithic Newton-Raphson solvers.
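The abstract contrasts arc-length control with force control, which stalls at limit points. A minimal spherical arc-length sketch on a one-dimensional cubic response with snap-through; this is a classical arc-length solver for illustration, not the authors' UAL formulation (which treats the external force vector as an independent variable):

```python
import numpy as np

def arc_length_trace(f_int, df_int, f_ext=1.0, dl=0.05, psi=1.0, n_steps=80):
    """Trace the equilibrium path f_int(u) = lam * f_ext with a spherical
    arc-length constraint, so limit points do not stall the solver.

    Each increment solves, by Newton iteration on (du, dlam):
        f_int(u + du) - (lam + dlam) * f_ext = 0
        du**2 + psi**2 * dlam**2 - dl**2     = 0
    """
    u, lam = 0.0, 0.0
    path = [(u, lam)]
    du, dlam = 0.0, dl / psi                      # first predictor: pure load step
    for _ in range(n_steps):
        for _ in range(50):
            r1 = f_int(u + du) - (lam + dlam) * f_ext
            r2 = du ** 2 + psi ** 2 * dlam ** 2 - dl ** 2
            if abs(r1) < 1e-11 and abs(r2) < 1e-11:
                break
            J = np.array([[df_int(u + du), -f_ext],
                          [2.0 * du, 2.0 * psi ** 2 * dlam]])
            du, dlam = np.array([du, dlam]) + np.linalg.solve(J, [-r1, -r2])
        u, lam = u + du, lam + dlam               # accept; reuse (du, dlam) as predictor
        path.append((u, lam))
    return path

# Cubic response with two limit points (snap-through), purely illustrative.
f_cubic = lambda u: u ** 3 - 1.8 * u ** 2 + 0.9 * u
df_cubic = lambda u: 3.0 * u ** 2 - 3.6 * u + 0.9

path = arc_length_trace(f_cubic, df_cubic)
```

A force-controlled Newton solve would diverge once lam exceeds the first limit load; the arc-length constraint instead lets the load factor decrease through the snap-back region.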
Population-based algorithms for improved history matching and uncertainty quantification of Petroleum reservoirs
In modern field management practices, there are two important steps that shed light on a multimillion-dollar investment. The first step is history matching, where the simulation model is calibrated to reproduce the historical observations from the field. In this inverse problem, different geological and petrophysical properties may provide equally good history matches. Such diverse models are likely to show different production behaviors in the future. This ties the history matching with the second step, uncertainty quantification of predictions. Multiple history matched models are essential for a realistic uncertainty estimate of the future field behavior. These two steps facilitate decision making and have a direct impact on the technical and financial performance of oil and gas companies.
Population-based optimization algorithms have recently enjoyed growing popularity for solving engineering problems. Population-based systems work with a group of individuals that cooperate and communicate to accomplish a task that is normally beyond the capabilities of each individual. These individuals are deployed with the aim of solving the problem with maximum efficiency.
This thesis introduces the application of two novel population-based algorithms for history matching and uncertainty quantification of petroleum reservoir models. Ant colony optimization and differential evolution algorithms are used to search the space of parameters to find multiple history matched models and, using a Bayesian framework, the posterior probability of each model is evaluated for prediction of reservoir performance.
It is demonstrated that by bringing in the latest developments in computer science, such as ant colony, differential evolution and multiobjective optimization, we can improve the history matching and uncertainty quantification frameworks. This thesis provides insights into the performance of these algorithms in history matching and prediction and develops an understanding of their tuning parameters. The research also includes a comparative study of these methods with a benchmark technique, the Neighbourhood Algorithm. This comparison reveals the superiority of the proposed methodologies in areas such as computational efficiency and match quality.
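Differential evolution, one of the two population-based algorithms used in the thesis, can be sketched as a DE/rand/1/bin loop calibrating a toy decline-curve "reservoir" model to synthetic observations (all names, bounds, and numbers are hypothetical, not from the thesis):

```python
import numpy as np

def differential_evolution(objective, bounds, pop_size=20, F=0.7, CR=0.9,
                           generations=200, seed=0):
    """Minimal DE/rand/1/bin loop: each individual is a candidate parameter
    set, and the population cooperates through mutation and crossover."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, (pop_size, lo.size))
    cost = np.array([objective(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)          # DE/rand/1 mutation
            cross = rng.random(lo.size) < CR                   # binomial crossover
            cross[rng.integers(lo.size)] = True                # keep >= 1 mutant gene
            trial = np.where(cross, mutant, pop[i])
            trial_cost = objective(trial)
            if trial_cost <= cost[i]:                          # greedy selection
                pop[i], cost[i] = trial, trial_cost
    best = np.argmin(cost)
    return pop[best], cost[best]

# Toy "history match": calibrate a decline-curve model q(t) = qi * exp(-d * t)
# to synthetic observed rates (truth: qi = 100, d = 0.3; purely illustrative).
t = np.linspace(0.0, 10.0, 25)
observed = 100.0 * np.exp(-0.3 * t)
mismatch = lambda x: float(np.sum((x[0] * np.exp(-x[1] * t) - observed) ** 2))

best, best_cost = differential_evolution(mismatch, [(50.0, 150.0), (0.05, 1.0)])
```

In the thesis setting, the final population as a whole, not just the best member, matters: the diverse near-optimal individuals supply the multiple history matched models needed for uncertainty quantification.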
Towards Interoperable Research Infrastructures for Environmental and Earth Sciences
This open access book summarises the latest developments on data management in the EU H2020 ENVRIplus project, which brought together more than 20 environmental and Earth science research infrastructures into a single community. It provides readers with a systematic overview of the common challenges faced by research infrastructures and how a ‘reference model guided’ engineering approach can be used to achieve greater interoperability among such infrastructures in the environmental and Earth sciences. The 20 contributions in this book are structured in 5 parts on the design, development, deployment, operation and use of research infrastructures. Part one provides an overview of the state of the art of research infrastructure and relevant e-Infrastructure technologies, part two discusses the reference model guided engineering approach, the third part presents the software and tools developed for common data management challenges, the fourth part demonstrates the software via several use cases, and the last part discusses the sustainability and future directions.