The Impact of Systematic Edits in History Slicing
While extracting a subset of a commit history, specifying the necessary
portion is a time-consuming task for developers. Several commit-based history
slicing techniques have been proposed to identify dependencies between commits
and to extract a related set of commits using a specific commit as a slicing
criterion. However, the resulting subset of commits becomes large when it
contains commits for systematic edits whose changes do not depend on each
other. We
empirically investigated the impact of systematic edits on history slicing. In
this study, commits in which systematic edits were detected are split per
file so that unnecessary dependencies between commits are eliminated. In
several histories of open source systems, the size of history slices was
reduced by 13.3-57.2% on average after splitting the commits for systematic
edits.
Comment: 5 pages, MSR 201
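The splitting step can be sketched in a few lines. This is a minimal illustration under simplifying assumptions: each commit is modeled as a map from file paths to edits, and a history slice is taken to be every commit touching a target file; the paper's actual dependency analysis is more involved.

```python
def split_commit(commit):
    """Split one multi-file commit {path: edit} into per-file commits."""
    return [{path: edit} for path, edit in sorted(commit.items())]

def slice_for_file(history, target):
    """A file-level history slice: every commit touching `target`."""
    return [c for c in history if target in c]

history = [
    {"a.py": "rename foo -> bar", "b.py": "rename foo -> bar"},  # systematic edit
    {"a.py": "fix off-by-one"},
]

before = slice_for_file(history, "a.py")
after = slice_for_file([p for c in history for p in split_commit(c)], "a.py")

# Slice size measured in (file, edit) pairs: splitting drops b.py's edit,
# which did not depend on the a.py change.
print(sum(len(c) for c in before), sum(len(c) for c in after))  # 3 2
```

Before splitting, slicing on `a.py` drags in the unrelated `b.py` half of the systematic edit; after splitting, only the `a.py` parts remain.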
Reducing regression test size by exclusion.
Operational software is constantly evolving. Regression testing is used to identify the unintended consequences of evolutionary changes. As most changes affect only a small proportion of the system, the challenge is to ensure that the regression test set is both safe (all relevant tests are used) and inclusive (only relevant tests are used). Previous approaches to reducing test sets struggle to find safe and inclusive tests by looking only at the changed code. We use decomposition program slicing to safely reduce the size of regression test sets by identifying those parts of a system that could not have been affected by a change; this information will then direct the selection of regression tests by eliminating tests that are not relevant to the change. The technique properly accounts for additions and deletions of code.
We extend and use Rothermel and Harrold’s framework for measuring the safety of regression test sets and introduce new safety and precision measures that do not require a priori knowledge of the exact number
of modification-revealing tests. We then analytically evaluate and compare our techniques for producing reduced regression test sets
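The exclusion idea can be illustrated with a toy selector. The names and coverage model here are hypothetical: plain coverage sets stand in for the paper's decomposition slices, which are computed from program dependences rather than raw coverage.

```python
# Toy exclusion-based regression test selection: a test is excluded when
# none of the program elements it exercises could have been affected by
# the change. Coverage sets approximate slices here (an assumption).

def select_tests(tests, affected):
    """Keep only tests whose covered elements intersect the affected set."""
    return {name for name, covered in tests.items() if covered & affected}

tests = {
    "test_login":   {"auth.check", "auth.hash"},
    "test_report":  {"report.render"},
    "test_billing": {"billing.total", "auth.check"},
}
affected = {"auth.check"}  # elements that could be affected by the change

print(sorted(select_tests(tests, affected)))  # ['test_billing', 'test_login']
```

`test_report` is safely excluded because its slice is disjoint from the change; the selection stays safe as long as the affected set over-approximates what the change can reach.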
Pathologies of hyperbolic gauges in general relativity and other field theories
We present a mathematical characterization of hyperbolic gauge pathologies in
general relativity and electrodynamics. We show how non-linear gauge terms can
produce a blow-up along characteristics and how this can be identified
numerically by performing convergence analysis. Finally, we show some numerical
examples and discuss the profound implications this may have for the field of
numerical relativity.
Comment: 5 pages, includes 2 figs. To appear in Phys.Rev.D Rapid Com
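The convergence analysis mentioned above is the standard three-resolution test: for a scheme of order p, the differences between solutions at resolutions h, 2h, and 4h satisfy (u_2h - u_4h)/(u_h - u_2h) = 2^p. A generic sketch with synthetic data (not from the paper) follows; near a gauge blow-up the measured order would degrade instead of holding steady.

```python
import math

def convergence_order(u_h, u_2h, u_4h):
    """Observed order p from three resolutions: (u_2h-u_4h)/(u_h-u_2h) = 2^p."""
    return math.log2((u_2h - u_4h) / (u_h - u_2h))

# Synthetic data: exact value 1.0 with a second-order error term ~ h^2.
exact = 1.0
u = lambda h: exact + 0.5 * h**2

print(round(convergence_order(u(0.01), u(0.02), u(0.04)), 3))  # 2.0
```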
Numerical relativity for D dimensional axially symmetric space-times: formalism and code tests
The numerical evolution of Einstein's field equations in a generic background
has the potential to answer a variety of important questions in physics: from
applications to the gauge-gravity duality, to modelling black hole production
in TeV gravity scenarios, analysis of the stability of exact solutions and
tests of Cosmic Censorship. In order to investigate these questions, we extend
numerical relativity to more general space-times than those investigated
hitherto, by developing a framework to study the numerical evolution of D
dimensional vacuum space-times with an SO(D-2) isometry group for D\ge 5, or
SO(D-3) for D\ge 6.
Performing a dimensional reduction on a (D-4)-sphere, the D dimensional
vacuum Einstein equations are rewritten as a 3+1 dimensional system with source
terms, and presented in the Baumgarte, Shapiro, Shibata and Nakamura (BSSN)
formulation. This allows the use of existing 3+1 dimensional numerical codes
with small adaptations. Brill-Lindquist initial data are constructed in D
dimensions and a procedure to match them to our 3+1 dimensional evolution
equations is given. We have implemented our framework by adapting the LEAN code
and perform a variety of simulations of non-spinning black hole space-times.
Specifically, we present a modified moving puncture gauge which facilitates
long term stable simulations in D=5. We further demonstrate the internal
consistency of the code by studying convergence and comparing numerical versus
analytic results in the case of geodesic slicing for D=5,6.
Comment: 31 pages, 6 figures; v2: minor changes and two references added.
Matches the published version in PRD
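As background (not stated in the abstract), Brill-Lindquist initial data generalize to D dimensions as a conformally flat metric whose conformal factor has the schematic form

```latex
\psi = 1 + \sum_i \frac{\mu_i}{r_i^{\,D-3}}
```

where $r_i$ is the conformal distance to the $i$-th puncture and $\mu_i$ is a mass parameter; the precise normalization of $\mu_i$ relative to the ADM mass depends on conventions.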
Hyperbolic slicings of spacetime: singularity avoidance and gauge shocks
I study the Bona-Masso family of hyperbolic slicing conditions, considering
in particular its properties when approaching two different types of
singularities: focusing singularities and gauge shocks. For focusing
singularities, I extend the original analysis of Bona et al. and show that both
marginal and strong singularity avoidance can be obtained for certain types of
behavior of the slicing condition as the lapse approaches zero. For the case of
gauge shocks, I re-derive a condition found previously that eliminates them.
Unfortunately, such a condition limits considerably the type of slicings
allowed. However, useful slicing conditions can still be found if one asks for
this condition to be satisfied only approximately. Such less restrictive
conditions include a particular member of the 1+log family, which in the past
has been found empirically to be extremely robust for both Brill wave and black
hole simulations.
Comment: 11 pages, revtex4. Change in acknowledgment
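For reference, the Bona-Masso family discussed here is the standard slicing condition (written with zero shift)

```latex
\partial_t \alpha = -\alpha^2 f(\alpha)\, K , \qquad f(\alpha) > 0 ,
% 1+log slicing: f(\alpha) = 2/\alpha  \;\Rightarrow\; \partial_t \alpha = -2\,\alpha K
```

where $\alpha$ is the lapse and $K$ the trace of the extrinsic curvature; the 1+log member mentioned in the abstract corresponds to $f(\alpha) = 2/\alpha$.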
Towards a Realistic Neutron Star Binary Inspiral: Initial Data and Multiple Orbit Evolution in Full General Relativity
This paper reports on our effort in modeling realistic astrophysical neutron
star binaries in general relativity. We analyze under what conditions the
conformally flat quasiequilibrium (CFQE) approach can generate
``astrophysically relevant'' initial data, by developing an analysis that
determines the violation of the CFQE approximation in the evolution of the
binary described by the full Einstein theory. We show that the CFQE assumptions
significantly violate the Einstein field equations for corotating neutron stars
at orbital separations nearly double that of the innermost stable circular
orbit (ISCO) separation, thus calling into question the astrophysical relevance
of the ISCO determined in the CFQE approach. With the need to start numerical
simulations at large orbital separation in mind, we push for stable and long
term integrations of the full Einstein equations for the binary neutron star
system. We demonstrate the stability of our numerical treatment and analyze the
stringent requirements on resolution and size of the computational domain for
an accurate simulation of the system.
Comment: 22 pages, 18 figures, accepted to Phys. Rev.