Lagrangian Data-Driven Reduced Order Modeling of Finite Time Lyapunov Exponents
There are two main strategies for improving the projection-based reduced
order model (ROM) accuracy: (i) improving the ROM, i.e., adding new terms to
the standard ROM; and (ii) improving the ROM basis, i.e., constructing ROM
bases that yield more accurate ROMs. In this paper, we pursue the latter strategy. We
propose new Lagrangian inner products that we use together with Eulerian and
Lagrangian data to construct new Lagrangian ROMs. We show that the new
Lagrangian ROMs are orders of magnitude more accurate than the standard
Eulerian ROMs, i.e., ROMs that use the standard Eulerian inner product and data to
construct the ROM basis. Specifically, for the quasi-geostrophic equations, we
show that the new Lagrangian ROMs are more accurate than the standard Eulerian
ROMs in approximating not only Lagrangian fields (e.g., the finite time
Lyapunov exponent (FTLE)), but also Eulerian fields (e.g., the streamfunction).
We emphasize that the new Lagrangian ROMs do not employ any closure modeling to
model the effect of discarded modes (which is standard procedure for
low-dimensional ROMs of complex nonlinear systems). Thus, the dramatic increase
in the new Lagrangian ROMs' accuracy is entirely due to the novel Lagrangian
inner products used to build the Lagrangian ROM basis.
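Both ROM families rest on the same basis-construction step: proper orthogonal decomposition (POD) of snapshot data with respect to a chosen inner product. As a minimal sketch of how a non-standard inner product enters that computation, the NumPy fragment below builds a POD basis orthonormal in <u, v> = u^T W v; the weight matrix W standing in for a Lagrangian inner product is an illustrative assumption, not the paper's actual construction.

```python
import numpy as np

def pod_basis(snapshots, weight, r):
    """Return r POD modes orthonormal in the inner product <u, v> = u^T W v.

    snapshots : (n, m) array whose columns are solution snapshots
    weight    : (n, n) symmetric positive-definite matrix W
    r         : number of modes to retain
    """
    # A Cholesky factor of W maps the weighted problem to a standard SVD.
    L = np.linalg.cholesky(weight)
    U, s, _ = np.linalg.svd(L.T @ snapshots, full_matrices=False)
    # Map back so the columns are W-orthonormal POD modes.
    phi = np.linalg.solve(L.T, U[:, :r])
    return phi, s[:r]

# Toy usage: with W = I this reduces to standard (Eulerian-style) POD.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 8))
W = np.eye(20)
phi, s = pod_basis(X, W, 3)
err = np.linalg.norm(phi.T @ W @ phi - np.eye(3))  # W-orthonormality check
```

Swapping W changes which flow features the retained modes prioritize, which is the lever a non-Eulerian inner product pulls.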
Improvements to the APBS biomolecular solvation software suite
The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve
the equations of continuum electrostatics for large biomolecular assemblages
and has had an impact on the study of a broad range of chemical, biological,
and biomedical applications. APBS addresses three key technology challenges for
understanding solvation and electrostatics in biomedical applications: accurate
and efficient models for biomolecular solvation and electrostatics, robust and
scalable software for applying those theories to biomolecular systems, and
mechanisms for sharing and analyzing biomolecular electrostatics data in the
scientific community. To address new research applications and advances in
computational capabilities, we have continually updated APBS and its suite of
accompanying software since its release in 2001. In this manuscript, we discuss
the models and capabilities that have recently been implemented within the APBS
software package including: a Poisson-Boltzmann analytical and a
semi-analytical solver, an optimized boundary element solver, a geometry-based
geometric flow solvation model, a graph-theory-based algorithm for determining
pKa values, and an improved web-based visualization tool for viewing
electrostatics.
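To make the continuum-electrostatics setting concrete, the sketch below solves the 1-D linearized Poisson-Boltzmann (Debye-Hueckel) equation by finite differences and checks it against the analytic screened-Coulomb profile. This toy solver and its parameter values are illustrative assumptions; it is not part of APBS.

```python
import numpy as np

def linearized_pb_1d(phi0, kappa, length, n):
    """Finite-difference solve of the 1-D linearized Poisson-Boltzmann
    (Debye-Hueckel) equation phi'' = kappa**2 * phi on [0, length],
    with Dirichlet data taken from the analytic screened solution."""
    x = np.linspace(0.0, length, n)
    h = x[1] - x[0]
    exact = phi0 * np.exp(-kappa * x)  # analytic screened potential
    # Tridiagonal system for the interior nodes:
    # phi[i-1] + (-2 - (kappa*h)**2) phi[i] + phi[i+1] = 0
    m = n - 2
    A = np.zeros((m, m))
    b = np.zeros(m)
    diag = -2.0 - (kappa * h) ** 2
    for i in range(m):
        A[i, i] = diag
        if i > 0:
            A[i, i - 1] = 1.0
        if i < m - 1:
            A[i, i + 1] = 1.0
    b[0] -= exact[0]    # move known boundary values to the right-hand side
    b[-1] -= exact[-1]
    phi = np.empty(n)
    phi[0], phi[-1] = exact[0], exact[-1]
    phi[1:-1] = np.linalg.solve(A, b)
    return x, phi, exact

x, phi, exact = linearized_pb_1d(1.0, 2.0, 3.0, 201)
err = np.max(np.abs(phi - exact))  # second-order accurate, so err = O(h^2)
```

The same second-order structure, generalized to three dimensions with dielectric and ionic-strength coefficients, underlies grid-based Poisson-Boltzmann solvers.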
A Toy Model for Testing Finite Element Methods to Simulate Extreme-Mass-Ratio Binary Systems
Extreme mass ratio binary systems, binaries involving stellar mass objects
orbiting massive black holes, are considered to be a primary source of
gravitational radiation to be detected by the space-based interferometer LISA.
The numerical modelling of these binary systems is extremely challenging
because the scales involved span several orders of magnitude. One needs
to handle large wavelength scales comparable to the size of the massive black
hole and, at the same time, to resolve the scales in the vicinity of the small
companion where radiation reaction effects play a crucial role. Adaptive finite
element methods, in which quantitative control of errors is achieved
automatically by finite element mesh adaptivity based on a posteriori error
estimation, are a natural choice that has great potential for achieving the
high level of adaptivity required in these simulations. To demonstrate this, we
present the results of simulations of a toy model, consisting of a point-like
source orbiting a black hole under the action of a scalar gravitational field.
Comment: 29 pages, 37 figures. RevTeX 4.0. Minor changes to match the published version.
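The error-driven adaptivity described above can be caricatured in one dimension: refine any element whose local interpolation-error indicator (a crude a posteriori estimate) exceeds a tolerance, so resolution concentrates automatically around a steep feature, here a stand-in for the region near the small companion. The test function and tolerance are illustrative assumptions, not the paper's scalar-field model.

```python
import numpy as np

def adapt_mesh(f, a, b, tol, max_iter=30):
    """Error-driven 1-D mesh adaptivity sketch: bisect every element whose
    midpoint deviation from the linear interpolant exceeds tol."""
    nodes = np.linspace(a, b, 5)
    for _ in range(max_iter):
        mids = 0.5 * (nodes[:-1] + nodes[1:])
        # Indicator: |f(midpoint) - linear interpolant at midpoint|.
        eta = np.abs(f(mids) - 0.5 * (f(nodes[:-1]) + f(nodes[1:])))
        bad = eta > tol
        if not bad.any():
            break
        nodes = np.sort(np.concatenate([nodes, mids[bad]]))
    return nodes

# A steep feature at x = 0.3 embedded in a smooth background field.
f = lambda x: np.tanh(50.0 * (x - 0.3)) + 0.1 * np.sin(x)
nodes = adapt_mesh(f, 0.0, 1.0, 1e-3)
# Refinement clusters tightly around the steep feature and leaves the
# smooth region on the coarse initial mesh.
```

Production a posteriori estimators are derived from the residual of the discrete solution rather than from the exact function, but the refine-where-the-indicator-is-large loop is the same.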
An integral equation based numerical solution for nanoparticles illuminated with collimated and focused light
To address the large number of parameters involved in nano-optical problems, a more efficient computational method is necessary. An integral-equation-based numerical solution is developed for particles illuminated with collimated and focused incident beams. The solution procedure uses the method of weighted residuals, in which the integral equation is reduced to a matrix equation and then solved for the unknown electric field distribution. In the solution procedure, the effects of the surrounding medium and boundaries are taken into account using a Green’s function formulation. Therefore, there is no additional error due to artificial boundary conditions, unlike differential-equation-based techniques such as the finite-difference time-domain and finite element methods. In this formulation, only the scattering nanoparticle is discretized. Such an approach results in a smaller number of unknowns in the resulting matrix equation. The results are compared to the analytical Mie series solution for spherical particles, as well as to the finite element method for rectangular metallic particles. The Richards-Wolf vector field equations are combined with the integral-equation-based formulation to model the interaction of nanoparticles with linearly and radially polarized incident focused beams.
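The reduction of an integral equation to a matrix equation can be sketched on a scalar model problem: a Fredholm equation of the second kind discretized by Nystrom collocation, a simple member of the weighted-residual family the abstract refers to. The separable kernel and right-hand side below are illustrative choices with a known closed-form solution, not the electromagnetic operator of the paper.

```python
import numpy as np

def solve_fredholm(kernel, f, a, b, n):
    """Nystrom reduction of the Fredholm equation of the second kind
    phi(x) - int_a^b K(x, t) phi(t) dt = f(x) to a dense linear system."""
    x = np.linspace(a, b, n)
    w = np.full(n, (b - a) / (n - 1))  # trapezoid quadrature weights
    w[0] *= 0.5
    w[-1] *= 0.5
    K = kernel(x[:, None], x[None, :])
    # Collocating at the quadrature nodes gives (I - K diag(w)) phi = f.
    A = np.eye(n) - K * w[None, :]
    return x, np.linalg.solve(A, f(x))

# K(x, t) = x*t with f(x) = x has the exact solution phi(x) = 1.5*x:
# phi(x) - x * int_0^1 t * 1.5 t dt = 1.5x - 0.5x = x.
x, phi = solve_fredholm(lambda x, t: x * t, lambda x: x, 0.0, 1.0, 401)
err = np.max(np.abs(phi - 1.5 * x))
```

Because only the integration domain is discretized, the unknown count scales with the scatterer alone, which is the advantage the abstract claims over volumetric differential-equation methods.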
An exact general remeshing scheme applied to physically conservative voxelization
We present an exact general remeshing scheme to compute analytic integrals of
polynomial functions over the intersections between convex polyhedral cells of
old and new meshes. In physics applications this allows one to ensure global
mass, momentum, and energy conservation while applying higher-order polynomial
interpolation. We elaborate on applications of our algorithm arising in the
analysis of cosmological N-body data, computer graphics, and continuum
mechanics problems.
We focus on the particular case of remeshing tetrahedral cells onto a
Cartesian grid such that the volume integral of the polynomial density function
given on the input mesh is guaranteed to equal the corresponding integral over
the output mesh. We refer to this as "physically conservative voxelization".
At the core of our method is an algorithm for intersecting two convex
polyhedra by successively clipping one against the faces of the other. This
algorithm is an implementation of the ideas presented abstractly by Sugihara
(1994), who suggests using the planar graph representations of convex polyhedra
to ensure topological consistency of the output. This makes our implementation
robust to geometric degeneracy in the input. We employ a simplicial
decomposition to calculate moment integrals up to quadratic order over the
resulting intersection domain.
We also address practical issues arising in a software implementation,
including numerical stability in geometric calculations, management of
cancellation errors, and extension to two dimensions. In a comparison to recent
work, we show substantial performance gains. We provide a C implementation
intended to be a fast, accurate, and robust tool for geometric calculations on
polyhedral mesh elements.
Comment: Code implementation available at https://github.com/devonmpowell/r3
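The successive-clipping idea at the core of the method is easiest to see in two dimensions, where it reduces to Sutherland-Hodgman clipping followed by a shoelace (zeroth-moment) integral. The sketch below is this simplified 2-D analogue, an illustration only, not the authors' graph-based 3-D implementation.

```python
def clip(poly, p1, p2):
    """One Sutherland-Hodgman step: keep the part of a convex polygon on
    the left of the directed edge p1->p2 (the 2-D analogue of clipping a
    polyhedron against one face plane)."""
    def side(p):  # signed cross product; > 0 means left of the edge
        return (p2[0] - p1[0]) * (p[1] - p1[1]) - (p2[1] - p1[1]) * (p[0] - p1[0])
    out = []
    for i in range(len(poly)):
        a, b = poly[i], poly[(i + 1) % len(poly)]
        sa, sb = side(a), side(b)
        if sa >= 0:
            out.append(a)
        if sa * sb < 0:  # the edge a->b crosses the clip line
            t = sa / (sa - sb)
            out.append((a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1])))
    return out

def intersect_convex(subject, clipper):
    """Successively clip `subject` against each edge of convex `clipper`."""
    poly = subject
    for i in range(len(clipper)):
        poly = clip(poly, clipper[i], clipper[(i + 1) % len(clipper)])
        if not poly:
            break
    return poly

def area(poly):
    """Shoelace area: the zeroth moment over the intersection domain."""
    return 0.5 * abs(sum(poly[i][0] * poly[(i + 1) % len(poly)][1]
                         - poly[(i + 1) % len(poly)][0] * poly[i][1]
                         for i in range(len(poly))))

# Unit square clipped by a half-unit-shifted copy: the overlap is 0.25.
sq = [(0, 0), (1, 0), (1, 1), (0, 1)]
shifted = [(0.5, 0.5), (1.5, 0.5), (1.5, 1.5), (0.5, 1.5)]
ov = intersect_convex(sq, shifted)
```

Higher moments (for conserving momentum and energy alongside mass) follow by decomposing the clipped polygon into simplices and integrating the polynomial on each, as the paper does in 3-D.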
Air pollution modelling using a graphics processing unit with CUDA
The Graphics Processing Unit (GPU) is a powerful tool for parallel computing.
In the past years the performance and capabilities of GPUs have increased, and
the Compute Unified Device Architecture (CUDA) - a parallel computing
architecture - has been developed by NVIDIA to utilize this performance in
general purpose computations. Here we show, for the first time, a possible
application of GPUs to environmental studies, serving as a basis for decision
making strategies. A stochastic Lagrangian particle model has been developed on
CUDA to estimate the transport and the transformation of the radionuclides from
a single point source during an accidental release. Our results show that
parallel implementation achieves typical speedups on the order of 80-120
compared to a single-threaded CPU implementation on a 2.33
GHz desktop computer. Only very small differences have been found between the
results obtained from GPU and CPU simulations, which are comparable with the
effect of stochastic transport phenomena in the atmosphere. The relatively high
speedup with no additional costs to maintain this parallel architecture could
result in a wide usage of GPU for diversified environmental applications in the
near future.
Comment: 5 figures
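The per-particle independence that makes such a model GPU-friendly can be seen in a minimal CPU sketch of the stochastic Lagrangian step: advection by a mean wind plus a Gaussian random-walk increment modeling turbulent diffusion. The parameter values and the simple homogeneous-turbulence closure below are illustrative assumptions, not the paper's dispersion model.

```python
import numpy as np

def disperse(n_particles, n_steps, dt, wind, sigma, seed=0):
    """Stochastic Lagrangian particle transport from a point source:
    each step adds a deterministic wind displacement and a Gaussian
    random displacement of standard deviation sigma*sqrt(dt). Every
    particle evolves independently, which is exactly what maps onto
    one GPU thread per particle in a CUDA implementation."""
    rng = np.random.default_rng(seed)
    pos = np.zeros((n_particles, 2))  # all particles released at the source
    for _ in range(n_steps):
        pos += wind * dt + sigma * np.sqrt(dt) * rng.standard_normal(pos.shape)
    return pos

pos = disperse(20000, 100, 1.0, np.array([1.0, 0.0]), 0.5)
mean_x = pos[:, 0].mean()  # mean drift follows the wind: ~ wind_x * t
std_y = pos[:, 1].std()    # crosswind spread grows like sigma * sqrt(t)
```

Since there is no inter-particle communication, the speedup of a GPU port is governed almost entirely by how many of these independent trajectories can be advanced concurrently.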
Developments in the simulation of compressible inviscid and viscous flow on supercomputers
In anticipation of future supercomputers, finite difference codes are rapidly being extended to simulate three-dimensional compressible flow about complex configurations. Some of these developments are reviewed. The importance of computational flow visualization and diagnostic methods to three-dimensional flow simulation is also briefly discussed.
A Phase Field Model for Continuous Clustering on Vector Fields
A new method for the simplification of flow fields is presented. It is based on continuous clustering. A well-known physical clustering model, the Cahn-Hilliard model, which describes phase separation, is modified to reflect the properties of the data to be visualized. Clusters are defined implicitly as connected components of the positivity set of a density function. An evolution equation for this function is obtained as a suitable gradient flow of an underlying anisotropic energy functional. Here, time serves as the scale parameter. The evolution is characterized by a successive coarsening of patterns, the actual clustering, during which the underlying simulation data specifies preferable pattern boundaries. We introduce specific physical quantities in the simulation to control the shape, orientation and distribution of the clusters as a function of the underlying flow field. In addition, the model is extended to include elastic effects. In the early stages of the evolution, a shear-layer-type representation of the flow field can thereby be generated, whereas in later stages the distribution of clusters can be influenced. Furthermore, we incorporate upwind ideas to give the clusters an oriented drop-shaped appearance. Here, we discuss the applicability of this new type of approach mainly for flow fields, where the cluster energy penalizes cross-streamline boundaries. However, the method also carries over to other fields. The clusters can be displayed directly as a flow texture. Alternatively, the clusters can be visualized by iconic representations, which are positioned by using a skeletonization algorithm.
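The phase-separation mechanism at the heart of the method can be sketched with a plain 1-D Cahn-Hilliard model on a periodic domain: a small perturbation of the mixed state decomposes into plateaus near +1 and -1 that subsequently coarsen, while the total "mass" is conserved. This explicit finite-difference toy (illustrative parameter values, no anisotropy, upwinding, or flow-field coupling) shows only the scalar gradient-flow core, not the authors' modified anisotropic model.

```python
import numpy as np

def cahn_hilliard_1d(n=64, steps=8000, dt=5e-6, eps2=1e-4, seed=1):
    """Explicit 1-D Cahn-Hilliard evolution on a periodic unit interval:
    dc/dt = Lap(c**3 - c - eps2 * Lap(c)).
    The tiny dt reflects the h**4 stability limit of the explicit scheme."""
    h = 1.0 / n
    rng = np.random.default_rng(seed)
    c = 0.01 * rng.standard_normal(n)  # small perturbation of the mixed state
    c -= c.mean()                      # start exactly at the conserved mean 0
    lap = lambda u: (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / h**2
    for _ in range(steps):
        mu = c**3 - c - eps2 * lap(c)  # chemical potential
        c = c + dt * lap(mu)           # conservative gradient-flow update
    return c

c = cahn_hilliard_1d()
# Spinodal decomposition drives c toward plateaus near +1 and -1 while the
# spatial mean stays at its initial value.
```

In the visualization setting, clusters are the connected components where c is positive; the anisotropic energy in the paper biases the interface motion so those components align with the flow.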