
    Numerical Evolution of Black Holes with a Hyperbolic Formulation of General Relativity

    We describe a numerical code that solves Einstein's equations for a Schwarzschild black hole in spherical symmetry, using a hyperbolic formulation introduced by Choquet-Bruhat and York. This is the first time this formulation has been used to evolve a numerical spacetime containing a black hole. We excise the hole from the computational grid in order to avoid the central singularity. We describe in detail a causal differencing method that should allow one to stably evolve a hyperbolic system of equations in three spatial dimensions with an arbitrary shift vector, to second-order accuracy in both space and time. We demonstrate the success of this method in the spherically symmetric case.

    Comment: 23 pages RevTeX plus 7 PostScript figures. Submitted to Phys. Rev.
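The "second-order accuracy in both space and time" that this abstract refers to can be illustrated on a far simpler model problem. The sketch below is an assumed illustration, not the paper's code (which evolves Einstein's equations with excision and a shift vector): it applies centred second-order spatial differencing and leapfrog time-stepping to the 1D wave equation with fixed boundaries.

```python
import math

def evolve_wave(n=201, steps=200, c=1.0, cfl=0.5):
    """Leapfrog evolution of u_tt = c^2 u_xx on [0, 1], second order in
    both space and time, with u = 0 at both ends.

    Initial data: u(x, 0) = sin(pi x), u_t(x, 0) = 0 (a standing wave).
    """
    dx = 1.0 / (n - 1)
    dt = cfl * dx / c                       # CFL-limited time step
    r2 = (c * dt / dx) ** 2
    x = [i * dx for i in range(n)]
    u_prev = [math.sin(math.pi * xi) for xi in x]
    # First step from a Taylor expansion (initial velocity is zero).
    u_curr = [0.0] * n
    for i in range(1, n - 1):
        u_curr[i] = u_prev[i] + 0.5 * r2 * (
            u_prev[i + 1] - 2 * u_prev[i] + u_prev[i - 1])
    for _ in range(steps):
        u_next = [0.0] * n
        for i in range(1, n - 1):
            # centred in space and time: the classic leapfrog stencil
            u_next[i] = (2 * u_curr[i] - u_prev[i]
                         + r2 * (u_curr[i + 1] - 2 * u_curr[i] + u_curr[i - 1]))
        u_prev, u_curr = u_curr, u_next
    return x, u_curr
```

Halving dx (with dt tied to dx through the CFL factor) should shrink the error against the exact standing-wave solution by roughly a factor of four, which is the practical test of second-order convergence.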

    A multi-block infrastructure for three-dimensional time-dependent numerical relativity

    We describe a generic infrastructure for time evolution simulations in numerical relativity using multiple grid patches. After motivating this approach, we discuss the relative advantages of global and patch-local tensor bases. We describe both our multi-patch infrastructure and our time evolution scheme, and comment on adaptive time integrators and parallelisation. We also describe various patch system topologies that provide spherical outer and/or multiple inner boundaries. We employ penalty inter-patch boundary conditions, and we demonstrate the stability and accuracy of our three-dimensional implementation. We solve both a scalar wave equation on a stationary rotating black hole background and the full Einstein equations. For the scalar wave equation, we compare the effects of global and patch-local tensor bases, of different finite differencing operators, and of artificial dissipation on stability and accuracy. We show that multi-patch systems can directly compete with the so-called fixed mesh refinement approach; however, one can also combine both. For the Einstein equations, we show that using multiple grid patches with penalty boundary conditions leads to a robustly stable system. We also show long-term stable and accurate evolutions of a one-dimensional non-linear gauge wave. Finally, we evolve weak gravitational waves in three dimensions and extract accurate waveforms, taking advantage of the spherical shape of our grid lines.

    Comment: 18 pages. Some clarifications added, figure layout improved.
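Penalty inter-patch coupling works by adding a term that weakly drives a patch's boundary point toward the neighbouring patch's value, instead of sharing grid points. The toy below is a hedged sketch: first-order upwind advection in 1D with forward Euler, whereas the paper uses high-order summation-by-parts operators on curvilinear 3D patches. It shows only the mechanism at a single interface.

```python
def advect_two_patches(n=100, steps=250, c=1.0, cfl=0.4, tau=1.0):
    """Advect u_t + c u_x = 0 (c > 0) across two abutting 1D patches that
    share no grid points; the interface is coupled by a penalty term.

    Left patch covers [0, 1), right patch covers [1, 2).
    """
    dx = 1.0 / n
    dt = cfl * dx / c
    left = [1.0 if 0.2 <= i * dx <= 0.4 else 0.0 for i in range(n)]
    right = [0.0] * n
    for _ in range(steps):
        new_l, new_r = left[:], right[:]
        for i in range(1, n):
            # first-order upwind update in each patch interior
            new_l[i] = left[i] - c * dt / dx * (left[i] - left[i - 1])
            new_r[i] = right[i] - c * dt / dx * (right[i] - right[i - 1])
        # penalty term: drive the right patch's inflow point toward the
        # left patch's outflow value, with strength tau * c / dx
        new_r[0] = right[0] - tau * c * dt / dx * (right[0] - left[-1])
        left, right = new_l, new_r
    return left, right
```

With tau = 1 and this stencil the penalty update happens to coincide with upwinding across the interface; smaller tau enforces the interface condition only weakly, which is the point of the penalty formulation.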

    FixMiner: Mining Relevant Fix Patterns for Automated Program Repair

    Patching is a common activity in software development. It is generally performed on a source code base to address bugs or add new functionalities. In this context, given the recurrence of bugs across projects, the associated similar patches can be leveraged to extract generic fix actions. While the literature includes various approaches leveraging similarity among patches to guide program repair, these approaches often do not yield fix patterns that are tractable and reusable as actionable input to APR systems. In this paper, we propose a systematic and automated approach to mining relevant and actionable fix patterns based on an iterative clustering strategy applied to atomic changes within patches. The goal of FixMiner is thus to infer separate and reusable fix patterns that can be leveraged in other patch generation systems. Our technique, FixMiner, leverages Rich Edit Script, which is a specialized tree structure of the edit scripts that captures the AST-level context of the code changes. FixMiner uses different tree representations of Rich Edit Scripts for each round of clustering to identify similar changes. These are abstract syntax trees, edit actions trees, and code context trees. We have evaluated FixMiner on thousands of software patches collected from open source projects. Preliminary results show that we are able to mine accurate patterns, efficiently exploiting change information in Rich Edit Scripts. We further integrated the mined patterns into an automated program repair prototype, PARFixMiner, with which we are able to correctly fix 26 bugs of the Defects4J benchmark. Beyond this quantitative performance, we show that the mined fix patterns are sufficiently relevant to produce patches with a high probability of correctness: 81% of PARFixMiner's generated plausible patches are correct.

    Comment: 31 pages, 11 figures.
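The iterative clustering over successive tree views of a Rich Edit Script can be sketched abstractly. Everything concrete below is hypothetical illustration: the field names "shape", "actions", "context" and the toy patches stand in for the abstract syntax trees, edit actions trees, and code context trees named in the abstract.

```python
from collections import defaultdict

def cluster_by(items, key):
    """Group items by a key; keep only recurring groups (size > 1),
    since a fix pattern must recur to be worth mining."""
    groups = defaultdict(list)
    for item in items:
        groups[key(item)].append(item)
    return [g for g in groups.values() if len(g) > 1]

# Toy "Rich Edit Scripts": three tree views flattened to hashable keys.
patches = [
    {"id": 1, "shape": "IfStmt(InfixExpr)", "actions": ("UPD", "operator"),
     "context": "null-check"},
    {"id": 2, "shape": "IfStmt(InfixExpr)", "actions": ("UPD", "operator"),
     "context": "null-check"},
    {"id": 3, "shape": "IfStmt(InfixExpr)", "actions": ("INS", "operand"),
     "context": "bounds-check"},
    {"id": 4, "shape": "MethodInvocation", "actions": ("UPD", "name"),
     "context": "api-misuse"},
]

def mine_patterns(patches):
    """Round 1 clusters on the syntax-tree view, round 2 refines each
    cluster on the edit-action view, round 3 on code context."""
    clusters = [patches]
    for view in ("shape", "actions", "context"):
        refined = []
        for cluster in clusters:
            refined.extend(cluster_by(cluster, lambda p: p[view]))
        clusters = refined
    return clusters
```

On this toy input only patches 1 and 2 survive all three rounds, i.e. only the change that recurs at every abstraction level becomes a mined pattern.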

    Adaptive Mesh Refinement for Characteristic Grids

    I consider techniques for Berger-Oliger adaptive mesh refinement (AMR) when numerically solving partial differential equations with wave-like solutions, using characteristic (double-null) grids. Such AMR algorithms are naturally recursive, and the best-known past Berger-Oliger characteristic AMR algorithm, that of Pretorius & Lehner (J. Comp. Phys. 198 (2004), 10), recurses on individual "diamond" characteristic grid cells. This leads to the use of fine-grained memory management, with individual grid cells kept in 2-dimensional linked lists at each refinement level. This complicates the implementation and adds overhead in both space and time. Here I describe a Berger-Oliger characteristic AMR algorithm which instead recurses on null slices. This algorithm is very similar to the usual Cauchy Berger-Oliger algorithm, and uses relatively coarse-grained memory management, allowing entire null slices to be stored in contiguous arrays in memory. The algorithm is very efficient in both space and time. I describe discretizations yielding both 2nd and 4th order global accuracy. My code implementing the algorithm described here is included in the electronic supplementary materials accompanying this paper, and is freely available to other researchers under the terms of the GNU general public license.

    Comment: 37 pages, 15 figures (40 eps figure files, 8 of them color; all viewable in black-and-white), 1 mpeg movie, uses Springer-Verlag svjour3 document class, includes C++ source code. Changes from v1: revised in response to referee comments: many references added, new figure added to better explain the algorithm, other small changes, C++ code updated to latest version.
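The recursion at the heart of Berger-Oliger refinement is compact enough to sketch. The function below is a minimal illustration of the control flow only (advance the whole slice at this level, then re-advance flagged regions with two half-steps one level finer); it omits everything that makes the real algorithm work on characteristic grids, such as regridding, fine-boundary interpolation, and the contiguous null-slice storage the paper introduces.

```python
def bo_advance(u, dt, level, max_level, flag, step):
    """One recursive Berger-Oliger time step.

    u         -- the current slice (any value `step` understands)
    flag      -- predicate: does this slice need refinement at this level?
    step      -- advances a slice by dt at the current resolution
    """
    coarse = step(u, dt)                       # provisional coarse advance
    if level < max_level and flag(u, level):
        # Re-cover the same time interval with two half-steps one level
        # finer, then inject the fine result upward over the coarse one.
        fine = bo_advance(u, dt / 2, level + 1, max_level, flag, step)
        fine = bo_advance(fine, dt / 2, level + 1, max_level, flag, step)
        coarse = fine
    return coarse
```

With refinement always triggered and a maximum depth of L, one call performs 2^(L+1) - 1 elementary steps, the usual Berger-Oliger work pattern.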

    Grid Global Behavior Prediction

    Complexity has always been one of the most important issues in distributed computing. From the first clusters to grid and now cloud computing, dealing correctly and efficiently with system complexity is the key to taking technology a step further. In this sense, global behavior modeling is an innovative methodology aimed at understanding the grid behavior. The main objective of this methodology is to synthesize the grid's vast, heterogeneous nature into a simple but powerful behavior model, represented in the form of a single, abstract entity, with a global state. Global behavior modeling has proved to be very useful in effectively managing grid complexity but, in many cases, deeper knowledge is needed. It generates a descriptive model that could be greatly improved if extended not only to explain behavior, but also to predict it. In this paper we present a prediction methodology whose objective is to define the techniques needed to create global behavior prediction models for grid systems. This global behavior prediction can benefit grid management, especially in areas such as fault tolerance or job scheduling. The paper presents experimental results obtained in real scenarios in order to validate this approach.

    Automating embedded analysis capabilities and managing software complexity in multiphysics simulation part I: template-based generic programming

    An approach for incorporating embedded simulation and analysis capabilities in complex simulation codes through template-based generic programming is presented. This approach relies on templating and operator overloading within the C++ language to transform a given calculation into one that can compute a variety of additional quantities that are necessary for many state-of-the-art simulation and analysis algorithms. An approach for incorporating these ideas into complex simulation codes through general graph-based assembly is also presented. These ideas have been implemented within a set of packages in the Trilinos framework and are demonstrated on a simple problem from chemical engineering.
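The core trick — writing a residual evaluation once against a generic scalar type and letting operator overloading propagate extra quantities such as derivatives — can be mimicked in Python, where overloading alone plays the role that templates plus overloading play in C++ (e.g. in Sacado, the automatic-differentiation package within Trilinos). A minimal forward-mode sketch with hypothetical names:

```python
class Dual:
    """A value together with its derivative; arithmetic operators
    propagate both, so unmodified 'simulation' code computes both."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)  # product rule
    __rmul__ = __mul__

def residual(x):
    # Generic "simulation" code, written once; it runs unchanged with
    # plain floats or with derivative-carrying Dual scalars.
    return 3 * x * x + 2 * x + 1

x = Dual(2.0, 1.0)   # seed dx/dx = 1
r = residual(x)      # r.val is the value, r.dot the derivative at x = 2
```

In the C++ setting the same effect is obtained by templating `residual` on its scalar type, so the compiler instantiates it for `double` and for the derivative-carrying type without code duplication.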

    Bias Reduction of Long Memory Parameter Estimators via the Pre-filtered Sieve Bootstrap

    This paper investigates the use of bootstrap-based bias correction of semi-parametric estimators of the long memory parameter in fractionally integrated processes. The re-sampling method involves the application of the sieve bootstrap to data pre-filtered by a preliminary semi-parametric estimate of the long memory parameter. Theoretical justification for using the bootstrap techniques to bias adjust log-periodogram and semi-parametric local Whittle estimators of the memory parameter is provided. Simulation evidence comparing the performance of the bootstrap bias correction with analytical bias correction techniques is also presented. The bootstrap method is shown to produce notable bias reductions, in particular when applied to an estimator for which analytical adjustments have already been used. The empirical coverage of confidence intervals based on the bias-adjusted estimators is very close to the nominal level for reasonably large sample sizes, more so than for the comparable analytically adjusted estimators. The precision of inferences (as measured by interval length) is also greater when the bootstrap, rather than an analytical adjustment, is used for bias correction.

    Comment: 38 pages.
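For context, the log-periodogram estimator being bias-corrected can be sketched directly from its definition: regress the log periodogram at the first m Fourier frequencies on -2·log(2·sin(λ/2)). This is a sketch of the estimator only, not of the pre-filtered sieve bootstrap correction itself, and the bandwidth choice m = √n is an assumption for illustration.

```python
import cmath
import math
import random

def gph_estimate(x, m=None):
    """Geweke/Porter-Hudak log-periodogram estimate of the long memory
    parameter d: OLS slope of log I(lambda_j) on -2*log(2*sin(lambda_j/2))
    over Fourier frequencies j = 1..m."""
    n = len(x)
    m = m or int(n ** 0.5)          # assumed bandwidth choice
    logs, regs = [], []
    for j in range(1, m + 1):
        lam = 2.0 * math.pi * j / n
        # periodogram ordinate at frequency lambda_j via a direct DFT
        dft = sum(x[t] * cmath.exp(-1j * lam * t) for t in range(n))
        periodogram = abs(dft) ** 2 / (2.0 * math.pi * n)
        logs.append(math.log(periodogram))
        regs.append(-2.0 * math.log(2.0 * math.sin(lam / 2.0)))
    rbar, lbar = sum(regs) / m, sum(logs) / m
    num = sum((r - rbar) * (l - lbar) for r, l in zip(regs, logs))
    den = sum((r - rbar) ** 2 for r in regs)
    return num / den                # OLS slope = estimate of d

# White noise has d = 0, so the estimate should be near zero.
random.seed(0)
d_hat = gph_estimate([random.gauss(0.0, 1.0) for _ in range(1024)], m=64)
```

The bootstrap scheme in the paper then pre-filters the data with such a preliminary estimate, resamples via a sieve (autoregressive) bootstrap, and uses the distribution of re-estimates to correct the bias of this slope.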

    Peru Mining: Analysis and Forecast of Mining Production in Peru Using Time Series and Data Science Techniques

    Peruvian mining plays a crucial role in the country's economy, being one of the main producers and exporters of minerals worldwide. In this project, an application was developed in RStudio that utilizes statistical analysis and time series modeling techniques to understand and forecast mineral extraction in different departments of Peru. The application includes an interactive map that allows users to explore Peruvian geography and obtain detailed statistics by clicking on each department. Additionally, bar charts, pie charts, and frequency polygons were implemented to visualize and analyze the data. Using the ARIMA model, predictions were made on the future extraction of minerals, enabling informed decision-making in planning and resource management within the mining sector. The application provides an interactive and accessible tool to explore the Peruvian mining industry, comprehend trends, and make accurate forecasts. The forecasts of total annual production in 2027 are as follows: Copper = 2,694,957 MT, Gold = 72,817.47 kg Fine, Zinc = 1,369,649 MT, Silver = 3,083,036 MT, Lead = 255,443 MT, Iron = 15,776,609 MT, Tin = 29,542 MT, Molybdenum = 35,044.66 MT, and Cadmium = 724 MT. These predictions, based on historical data, provide valuable information for strategic decision-making and contribute to the sustainable development of the mining industry in Peru.
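The application fits ARIMA models in R; as a language-agnostic illustration of the forecasting step, the sketch below fits a plain AR(1) with an intercept by least squares and iterates it forward. This is an assumed stand-in, deliberately far simpler than ARIMA order selection over (p, d, q) and with no differencing of the series.

```python
def ar1_forecast(series, horizon):
    """Fit x_t = c + phi * x_{t-1} by ordinary least squares on the
    lagged pairs, then iterate the fitted recursion `horizon` steps
    ahead. Returns the list of forecasts."""
    y = series[1:]                  # x_t
    x = series[:-1]                 # x_{t-1}
    n = len(y)
    xbar, ybar = sum(x) / n, sum(y) / n
    phi = (sum((a - xbar) * (b - ybar) for a, b in zip(x, y))
           / sum((a - xbar) ** 2 for a in x))   # OLS slope
    c = ybar - phi * xbar                        # OLS intercept
    forecasts, last = [], series[-1]
    for _ in range(horizon):
        last = c + phi * last       # plug each forecast back in
        forecasts.append(last)
    return forecasts
```

On an exactly linear series the fitted recursion has phi = 1 and c equal to the common increment, so the forecasts simply continue the line; on real production data one would difference first and validate the residuals.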