Modeling the Black Hole Excision Problem
We analyze the excision strategy for simulating black holes. The problem is
modeled by the propagation of quasi-linear waves in a 1-dimensional spatial
region with timelike outer boundary, spacelike inner boundary and a horizon in
between. Proofs of well-posed evolution and boundary algorithms for a second
differential order treatment of the system are given for the separate pieces
underlying the finite difference problem. These are implemented in a numerical
code which gives accurate long term simulations of the quasi-linear excision
problem. Excitation of long-wavelength exponential modes, which are latent in
the problem, is suppressed using conservation laws for the discretized system.
The techniques are designed to apply directly to recent codes for the Einstein
equations based upon the harmonic formulation.
Comment: 21 pages, 14 postscript figures, minor contents update
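As a rough illustration of the characteristic structure that excision exploits, the sketch below evolves a toy pair of advected fields whose speeds mimic those of a shifted 1D wave equation: near the inner edge both characteristic speeds point off the grid, so one-sided stencils close the scheme without inner boundary data, while the timelike outer edge needs data only for the incoming field. The shift profile, grid parameters and first-order upwind scheme are illustrative assumptions, not the second-differential-order quasi-linear algorithm analyzed in the paper.

```python
import numpy as np

# Toy model of the excision setup: two advected fields whose speeds mimic the
# characteristic speeds of a shifted 1D wave equation,
#   c_plus = 1 - beta(x),   c_minus = -1 - beta(x).
# beta > 1 at the inner edge (x = 0) makes that edge effectively spacelike:
# both speeds are negative there, so one-sided (upwind) stencils need no inner
# boundary data. Only the field entering through the outer edge (x = 1) needs data.
nx, cfl, n_steps = 401, 0.4, 2000
x = np.linspace(0.0, 1.0, nx)
h = x[1] - x[0]
beta = 1.5 - x                               # beta(0) = 1.5 > 1, beta(1) = 0.5 < 1
c_plus, c_minus = 1.0 - beta, -1.0 - beta
dt = cfl * h / np.max(np.abs(c_minus))       # c_minus has the largest speed here

def upwind_step(w, c):
    """One first-order upwind step of w_t + c(x) w_x = 0."""
    back = np.empty_like(w)
    back[1:] = (w[1:] - w[:-1]) / h          # backward difference, used where c > 0
    back[0] = (w[1] - w[0]) / h
    fwd = np.empty_like(w)
    fwd[:-1] = (w[1:] - w[:-1]) / h          # forward difference, used where c < 0
    fwd[-1] = (w[-1] - w[-2]) / h
    return w - dt * c * np.where(c > 0, back, fwd)

w_plus = np.exp(-300.0 * (x - 0.7) ** 2)     # pulse headed for the outer edge
w_minus = np.zeros_like(x)
for _ in range(n_steps):
    w_plus = upwind_step(w_plus, c_plus)
    w_minus = upwind_step(w_minus, c_minus)
    w_minus[-1] = 0.0                        # boundary data for the incoming field only
print("energy left on the grid:", float(np.sum(w_plus**2 + w_minus**2) * h))
```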
A comparison of in-sample forecasting methods
In-sample forecasting is a recent continuous modification of well-known forecasting methods based on aggregated data. These aggregated methods are known as age-cohort methods in demography, economics, epidemiology and sociology, and as chain ladder in non-life insurance. Data are organized in a two-way table with age and cohort as indices, but without measures of exposure. It has recently been established that such structured forecasting methods based on aggregated data can be interpreted as structured histogram estimators. Continuous in-sample forecasting transfers these classical forecasting models into a modern statistical world, including smoothing methodology that is more efficient than smoothing via histograms. All in-sample forecasting estimators are collected and their performance is compared via a finite-sample simulation study. All methods are extended via multiplicative bias correction. Asymptotic theory is developed for the histogram-type method of sieves and for the multiplicatively corrected estimators. The multiplicatively bias-corrected estimators improve on all other known in-sample forecasters in the simulation study. The density projection approach appears to perform best, with forecasting based on survival densities the runner-up.
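For intuition on the multiplicative bias correction mentioned above, the sketch below applies a generic multiplicative correction to a plain kernel density estimator: a pilot estimate is multiplied by a kernel-smoothed ratio that re-weights each observation by the inverse of the pilot fit at that observation. The Gaussian kernel, bandwidth and simulated sample are illustrative assumptions; this is not the in-sample forecasting estimator studied in the paper.

```python
import numpy as np

def gauss_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def kde(x, data, h):
    """Pilot kernel density estimate evaluated at the points x."""
    return gauss_kernel((x[:, None] - data[None, :]) / h).mean(axis=1) / h

def kde_multiplicative_bc(x, data, h):
    """Multiplicative bias correction of the pilot estimate:
    multiply f_hat(x) by a kernel-smoothed ratio that re-weights each
    observation X_i by 1 / f_hat(X_i)."""
    pilot_x = kde(x, data, h)
    pilot_at_data = np.maximum(kde(data, data, h), 1e-12)
    k = gauss_kernel((x[:, None] - data[None, :]) / h) / h
    correction = (k / pilot_at_data[None, :]).mean(axis=1)
    return pilot_x * correction

rng = np.random.default_rng(0)
sample = rng.normal(size=500)
grid = np.linspace(-3.0, 3.0, 61)
est = kde_multiplicative_bc(grid, sample, h=0.4)
print("estimated density near 0:", round(float(est[30]), 3))   # ~0.40 for N(0,1)
```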
Online Data Reduction for the Belle II Experiment using DATCON
The new Belle II experiment at the asymmetric accelerator SuperKEKB
at KEK in Japan is designed to deliver a peak luminosity of
8 x 10^35 cm^-2 s^-1. To perform high-precision track
reconstruction, e.g. for measurements of time-dependent CP-violating decays and
secondary vertices, the Belle II detector is equipped with a highly segmented
pixel detector (PXD). The high instantaneous luminosity and short bunch
crossing times result in a large stream of data in the PXD, which needs to be
significantly reduced for offline storage. The data reduction is performed
using an FPGA-based Data Acquisition Tracking and Concentrator Online Node
(DATCON), which uses information from the Belle II silicon strip vertex
detector (SVD) surrounding the PXD to carry out online track reconstruction,
extrapolation to the PXD, and Region of Interest (ROI) determination on the
PXD. The data stream is reduced by a factor of ten with an ROI finding
efficiency of >90% for PXD hits inside the ROI down to 50 MeV in transverse
momentum (p_T) of the stable particles. We will present the current status of the
implementation of the track reconstruction using Hough transformations, and the
results obtained for simulated Υ(4S) events.
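A toy version of Hough-transform track finding is sketched below: each 2D hit votes for every straight line r = x·cosθ + y·sinθ it could lie on, and the most-voted accumulator cell identifies a track candidate. The binning, the straight-line (rather than helical) track model and the simulated hits are illustrative assumptions and are unrelated to the DATCON FPGA firmware or the SVD geometry.

```python
import numpy as np

def hough_accumulate(hits, n_theta=180, n_r=100, r_max=10.0):
    """Fill a (theta, r) accumulator: every hit (x, y) votes along
    r = x*cos(theta) + y*sin(theta) for each sampled theta."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    r_edges = np.linspace(-r_max, r_max, n_r + 1)
    acc = np.zeros((n_theta, n_r), dtype=int)
    for x, y in hits:
        r = x * np.cos(thetas) + y * np.sin(thetas)
        r_bin = np.clip(np.digitize(r, r_edges) - 1, 0, n_r - 1)
        acc[np.arange(n_theta), r_bin] += 1
    return acc, thetas, r_edges

# hits roughly on the line y = 0.5 x + 1, plus uniform background hits
rng = np.random.default_rng(1)
xs = np.linspace(0.0, 5.0, 20)
track_hits = np.column_stack([xs, 0.5 * xs + 1.0])
background = rng.uniform(-5.0, 5.0, size=(30, 2))
acc, thetas, r_edges = hough_accumulate(np.vstack([track_hits, background]))
i, j = np.unravel_index(np.argmax(acc), acc.shape)
print("track candidate: theta =", round(float(thetas[i]), 3),
      " r =", round(float(0.5 * (r_edges[j] + r_edges[j + 1])), 3))
```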
A new ghost cell/level set method for moving boundary problems: application to tumor growth
In this paper, we present a ghost cell/level set method for the evolution of interfaces whose normal velocity depends upon the solutions of linear and nonlinear quasi-steady reaction-diffusion equations with curvature-dependent boundary conditions. Our technique includes a ghost cell method that accurately discretizes normal derivative jump boundary conditions without smearing jumps in the tangential derivative; a new iterative method for solving linear and nonlinear quasi-steady reaction-diffusion equations; an adaptive discretization to compute the curvature and normal vectors; and a new discrete approximation to the Heaviside function. We present numerical examples that demonstrate better than 1.5-order convergence for problems where traditional ghost cell methods either fail to converge or attain at best sub-linear accuracy. We apply our techniques to a model of tumor growth in complex, heterogeneous tissues that consists of a nonlinear nutrient equation and a pressure equation with geometry-dependent jump boundary conditions. We simulate the growth of glioblastoma (an aggressive brain tumor) into a large, 1 cm square of brain tissue that includes heterogeneous nutrient delivery and varied biomechanical characteristics (white matter, gray matter, cerebrospinal fluid, and bone), and we observe growth morphologies that are highly dependent upon the variations of the tissue characteristics, an effect observed in real tumor growth.
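Two of the geometric ingredients mentioned above, a smoothed Heaviside of the level-set function and the curvature computed from its normalized gradient, are easy to sketch on a uniform grid. The sine-smoothed Heaviside and central-difference curvature below are standard textbook choices used only for illustration; the paper develops its own discrete Heaviside approximation and an adaptive curvature discretization.

```python
import numpy as np

def smoothed_heaviside(phi, eps):
    """Standard sine-smoothed Heaviside of a level-set function phi
    (illustrative; the paper introduces its own discrete approximation)."""
    H = np.where(phi > eps, 1.0, 0.0)
    band = np.abs(phi) <= eps
    p = phi[band] / eps
    H[band] = 0.5 * (1.0 + p + np.sin(np.pi * p) / np.pi)
    return H

def curvature(phi, h):
    """kappa = div(grad phi / |grad phi|) by central differences."""
    px, py = np.gradient(phi, h)
    norm = np.sqrt(px**2 + py**2) + 1e-12
    nx, ny = px / norm, py / norm
    dnx_dx = np.gradient(nx, h, axis=0)
    dny_dy = np.gradient(ny, h, axis=1)
    return dnx_dx + dny_dy

# signed distance to a circle of radius 0.3: interface curvature is 1/0.3
x = np.linspace(-1.0, 1.0, 201)
h = x[1] - x[0]
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.sqrt(X**2 + Y**2) - 0.3
print("curvature near the interface:", round(float(curvature(phi, h)[130, 100]), 3))
print("Heaviside at the centre:", float(smoothed_heaviside(phi, 1.5 * h)[100, 100]))
```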
Networked PID control design: a pseudo-probabilistic robust approach
Networked Control Systems (NCS) are feedback/feed-forward control systems in which the control components (sensors, actuators and controllers) are distributed across a common communication network. In NCS there exist network-induced random delays in each channel. This paper proposes a method to compensate for the effects of these delays in the design and tuning of PID controllers. The control design is formulated as a constrained optimization problem, and the controller stability and robustness criteria are incorporated as design constraints. The design is based on a polytopic description of the system using a Poisson probability distribution of the delay. Simulation results are presented to demonstrate the performance of the proposed method.
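To make the delay effect concrete, the sketch below simulates a discrete PID loop on a first-order plant in which each measurement reaches the controller only after a Poisson-distributed number of sample periods, and the controller acts on the newest sample that has arrived. The plant model, gains and mean delay are illustrative assumptions; the paper's contribution is the constrained-optimization tuning and the polytopic delay description, which are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, n_steps = 0.01, 1500
Kp, Ki, Kd = 2.0, 1.0, 0.05        # hypothetical gains, not the paper's tuning
mean_delay = 3.0                   # mean network delay in sample periods (assumed)

# first-order plant x_dot = -x + u tracking a unit step reference
x, integral, prev_err = 0.0, 0.0, 0.0
latest_meas, latest_stamp = 0.0, -1
in_flight = []                     # pending packets: (arrival_step, send_step, value)
for k in range(n_steps):
    # the sensor sends x(k); it arrives after a Poisson-distributed delay
    in_flight.append((k + int(rng.poisson(mean_delay)), k, x))
    # the controller uses the most recently sent sample that has arrived
    for arrival, stamp, meas in in_flight:
        if arrival <= k and stamp > latest_stamp:
            latest_stamp, latest_meas = stamp, meas
    in_flight = [p for p in in_flight if p[0] > k]   # drop delivered packets
    err = 1.0 - latest_meas
    integral += err * dt
    derivative = (err - prev_err) / dt
    u = Kp * err + Ki * integral + Kd * derivative
    prev_err = err
    x += dt * (-x + u)             # forward-Euler step of the plant
print("plant output after", n_steps * dt, "s:", round(x, 3))
```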
GEANT4: a simulation toolkit
Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilised, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.
PACS: 07.05.Tp; 13; 2