
    Numerical Simulations of Oscillating Soliton Stars: Excited States in Spherical Symmetry and Ground State Evolutions in 3D

    Excited-state soliton stars are studied numerically for the first time. The stability of spherically symmetric S-branch excited-state oscillatons under radial perturbations is investigated using a 1D code. We find that these stars are inherently unstable, either migrating to the ground state or collapsing to black holes. Higher excited-state configurations are observed to cascade through intermediate excited states during their migration to the ground state, a behavior similar to that of excited-state boson stars. Ground-state oscillatons are then studied in full 3D numerical relativity. Finding an appropriate gauge condition for these dynamic oscillatons is much more challenging than in the case of boson stars. Different slicing conditions are explored, and a customized gauge condition that approximates polar slicing in spherical symmetry is implemented. Comparisons with 1D results and convergence tests are performed. The behavior of these stars under small axisymmetric perturbations is studied and gravitational waveforms are extracted. We find that the gravitational waves damp out on a short timescale, enabling us to obtain the complete waveform. This work is a starting point for the evolution of real scalar field systems with arbitrary symmetries. Comment: 12 pages, 11 figures, typos corrected, includes referee input, references corrected, published version
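
    The convergence tests mentioned above are a standard self-consistency check in numerical relativity. As an illustrative aside (not code from the paper), the observed convergence order can be estimated from runs at three resolutions; the assumption below is that the same quantity has been interpolated onto a common set of points at grid spacings h, h/2, and h/4.

        import numpy as np

        def convergence_order(f_h, f_h2, f_h4, refinement=2.0):
            """Estimate the observed convergence order p from three resolutions.

            For a scheme of order p, ||f_h - f_h2|| / ||f_h2 - f_h4|| ~ refinement**p,
            so p follows from the ratio of successive differences.
            """
            num = np.linalg.norm(f_h - f_h2)
            den = np.linalg.norm(f_h2 - f_h4)
            return np.log(num / den) / np.log(refinement)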

    High-Level Design of a Data Carousel for the Basic Fusion Files

    Sometimes a dataset is large enough that the resources needed merely to hold it can severely strain budgets. When resource constraints are severe and the alternative is not having access to the data at all, one option is to 1) use a cheaper storage solution, 2) mitigate the problems that arise from using that type of storage, and 3) work within the restrictions that the solution imposes. We present a white paper, based on limited prototyping, reflecting our current thinking on the high-level design and operational model using the Data Carousel access pattern, applied in the context of Amazon Web Services, for the 2.4 PB Basic Fusion Dataset.
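
    As a minimal sketch only (the white paper's actual design is not reproduced here), a data carousel can be thought of as a scheduler that stages fixed batches of files from cheap archival storage on a fixed rotation and serves a request only when the batch holding its files comes around; the batch layout and the stage() restore hook below are assumptions for illustration.

        class DataCarousel:
            """Toy data-carousel scheduler: files live in cold storage, grouped
            into batches that are staged on a fixed rotation rather than on demand."""

            def __init__(self, batches, stage):
                self.batches = batches        # list of lists of file keys
                self.stage = stage            # hypothetical callback that restores one batch
                self.cursor = 0
                self.pending = {i: [] for i in range(len(batches))}

            def request(self, batch_index, callback):
                """Register interest in a batch; it is served when the rotation reaches it."""
                self.pending[batch_index].append(callback)

            def tick(self):
                """Advance one step: stage the current batch and notify any waiters."""
                i = self.cursor
                self.stage(self.batches[i])   # e.g. a bulk restore from archival storage
                for callback in self.pending[i]:
                    callback(self.batches[i])
                self.pending[i] = []
                self.cursor = (i + 1) % len(self.batches)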

    Evolution of 3D Boson Stars with Waveform Extraction

    Numerical results from a study of boson stars under nonspherical perturbations using a fully general relativistic 3D code are presented, together with an analysis of the emitted gravitational radiation. We have constructed a simulation code suitable for the study of scalar fields in space-times of general symmetry by bringing together components for addressing the initial value problem, the full evolution system, and the detection and analysis of gravitational waves. Within a series of numerical simulations, we explicitly extract the Zerilli and Newman-Penrose scalar Ψ4 gravitational waveforms when the stars are subjected to different types of perturbations. Boson star systems have rapidly decaying nonradial quasinormal modes, and thus the complete gravitational waveform could be extracted for all configurations studied. The gravitational waves emitted from stable, critical, and unstable boson star configurations are analyzed, and the numerically observed quasinormal mode frequencies are compared with known linear perturbation results. The superposition of the high-frequency nonspherical modes on the lower-frequency spherical modes was observed in the metric oscillations when perturbations with radial and nonradial components were applied. The collapse of unstable boson stars to black holes was simulated. The apparent horizons were observed to be slightly nonspherical when initially detected and became spherical as the system evolved. The application of nonradial perturbations proportional to spherical harmonics is observed not to affect the collapse time. An unstable star subjected to a large perturbation was observed to migrate to a stable configuration. Comment: 26 pages, 12 figures
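
    As a hedged illustration of how quasinormal-mode frequencies might be read off an extracted waveform (this is not the paper's analysis pipeline), a single damped sinusoid can be fit to the late-time signal; the array names and initial guess below are placeholders.

        import numpy as np
        from scipy.optimize import curve_fit

        def damped_sinusoid(t, amplitude, omega, tau, phase):
            """Single quasinormal-mode model: A * exp(-t/tau) * cos(omega*t + phase)."""
            return amplitude * np.exp(-t / tau) * np.cos(omega * t + phase)

        def fit_qnm(t, waveform, p0=(1.0, 1.0, 10.0, 0.0)):
            """Least-squares fit of one damped sinusoid to an extracted waveform.

            t and waveform are 1D arrays (e.g. time and the real part of the
            extracted Zerilli or Psi4 signal); p0 is an initial guess for
            (amplitude, omega, tau, phase). omega and 1/tau give the mode's
            frequency and damping rate.
            """
            params, _ = curve_fit(damped_sinusoid, t, waveform, p0=p0)
            return params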

    Local vs. AWS provisioning: Experience fusing a month’s data on AWS and local provisioning

    The Terra ACCESS project provides enhanced access via fused data from all instruments on NASA's Terra Earth science satellite. The fused data set is 2.4 PB in size and covers the period 2000-2015. This document is a technical report from early 2019 comparing the benefits and costs of performing the data fusion on Amazon Web Services and on the Illinois campus cluster. NASA Award NNX16AM07A
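
    As a back-of-the-envelope aside (not a figure from the report), the sheer size of the dataset is what drives such provisioning comparisons; the sustained bandwidth below is a purely hypothetical value used only to show the arithmetic.

        # Rough transfer-time estimate for a 2.4 PB dataset (illustrative only).
        DATASET_PB = 2.4
        DATASET_BYTES = DATASET_PB * 10**15        # decimal petabytes
        BANDWIDTH_GBPS = 10                        # hypothetical sustained rate

        seconds = DATASET_BYTES * 8 / (BANDWIDTH_GBPS * 10**9)
        print(f"~{seconds / 86400:.0f} days at a sustained {BANDWIDTH_GBPS} Gb/s")   # ~22 days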

    The Dark Energy Survey Data Management System

    The Dark Energy Survey collaboration will study cosmic acceleration with a 5000 deg^2 grizY survey in the southern sky over 525 nights from 2011-2016. The DES data management (DESDM) system will be used to process and archive these data and the resulting science-ready data products. The DESDM system consists of an integrated archive, a processing framework, an ensemble of astronomy codes, and a data access framework. We are developing the DESDM system for operation in the high-performance computing (HPC) environments at NCSA and Fermilab. Operating the DESDM system in an HPC environment offers both speed and flexibility. We will employ it for our regular nightly processing needs and for more compute-intensive tasks such as large-scale image coaddition campaigns, extraction of weak lensing shear from the full survey dataset, and massive seasonal reprocessing of the DES data. Data products will be available to the Collaboration and later to the public through a virtual-observatory-compatible web portal. Our approach leverages investments in publicly available HPC systems, greatly reducing hardware and maintenance costs to the project, which must deploy and maintain only the storage, database platforms, and orchestration and web portal nodes that are specific to DESDM. In Fall 2007, we tested the current DESDM system on both simulated and real survey data. We used TeraGrid to process 10 simulated DES nights (3 TB of raw data), ingesting and calibrating approximately 250 million objects into the DES Archive database. We also used DESDM to process and calibrate over 50 nights of survey data acquired with the Mosaic2 camera. Comparison to truth tables in the case of the simulated data and internal crosschecks in the case of the real data indicate that astrometric and photometric data quality is excellent. Comment: To be published in the proceedings of the SPIE conference on Astronomical Instrumentation (held in Marseille in June 2008). This preprint is made available with the permission of SPIE. Further information, together with a preprint containing full-quality images, is available at http://desweb.cosmology.uiuc.edu/wik
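
    As an illustrative sketch of the kind of truth-table comparison described above (not the DESDM validation code itself), catalog objects can be crossmatched to truth positions by nearest neighbour within a small radius; the use of tangent-plane coordinates and the matching radius are assumptions.

        import numpy as np
        from scipy.spatial import cKDTree

        def crossmatch(truth_xy, catalog_xy, radius):
            """Match catalog positions to truth positions by nearest neighbour.

            truth_xy and catalog_xy are (N, 2) arrays of positions in a common
            tangent-plane projection; radius is the maximum separation accepted.
            Returns (catalog_indices, truth_indices) for the accepted matches.
            """
            tree = cKDTree(truth_xy)
            dist, idx = tree.query(catalog_xy, distance_upper_bound=radius)
            matched = np.isfinite(dist)            # unmatched entries come back as inf
            return np.flatnonzero(matched), idx[matched]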

    The Dark Energy Survey Data Processing and Calibration System

    The Dark Energy Survey (DES) is a 5000 deg^2 grizY survey reaching characteristic photometric depths of 24th magnitude (10 sigma) and enabling accurate photometry and morphology of objects ten times fainter than in SDSS. Preparations for DES have included building a dedicated 3 deg^2 CCD camera (DECam), upgrading the existing CTIO Blanco 4m telescope, and developing a new high-performance computing (HPC) enabled data management system (DESDM). The DESDM system will be used for processing, calibrating, and serving the DES data. The total data volumes are high (~2 PB), and so considerable effort has gone into designing an automated processing and quality control system. Special-purpose image detrending and photometric calibration codes have been developed to meet the data quality requirements, while survey astrometric calibration, coaddition, and cataloging rely on new extensions of the AstrOmatic codes, which now include tools for PSF modeling, PSF homogenization, PSF-corrected model-fitting cataloging, and joint model fitting across multiple input images. The DESDM system has been deployed on dedicated development clusters and HPC systems in the US and Germany. An extensive program of testing with small rapid-turnaround and larger campaign-scale simulated datasets has been carried out. The system has also been tested on large real datasets, including Blanco Cosmology Survey data from the Mosaic2 camera. In Fall 2012 the DESDM system will be used for DECam commissioning, and, thereafter, the system will go into full science operations. Comment: 12 pages, submitted for publication in SPIE Proceedings 8451-1
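
    As a hedged illustration of the photometric calibration step mentioned above (not the actual DESDM calibration code), a per-exposure zero point and extinction term can be solved by linear least squares against reference magnitudes; the variable names and the single-extinction-term model are assumptions.

        import numpy as np

        def solve_zeropoint(m_inst, m_ref, airmass):
            """Fit m_ref ~ m_inst + ZP + k * X by linear least squares.

            m_inst: instrumental magnitudes of matched stars, m_ref: reference
            magnitudes, airmass: airmass X for each star. Returns (ZP, k),
            the zero point and first-order extinction coefficient.
            """
            design = np.column_stack([np.ones_like(airmass), airmass])
            (zp, k), *_ = np.linalg.lstsq(design, m_ref - m_inst, rcond=None)
            return zp, k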

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg^2 field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ~24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg^2 with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg^2 region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey. The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world. Comment: 57 pages, 32 color figures, version with high-resolution figures available from https://www.lsst.org/overvie
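
    As an illustrative consistency check (not taken from the paper), the quoted coadded depth follows roughly from the single-visit depth and the number of visits per band under the usual sqrt(N) noise scaling; the per-band visit count below is an assumed round number.

        import math

        # Coadded depth under sqrt(N) scaling: m_coadd ~ m_single + 1.25 * log10(N)
        m_single_r = 24.5      # quoted single-visit 5-sigma depth in r (AB)
        n_visits_r = 200       # assumed order-of-magnitude number of r-band visits

        m_coadd_r = m_single_r + 1.25 * math.log10(n_visits_r)
        print(f"coadded r-band depth ~ {m_coadd_r:.1f}")   # ~27.4, consistent with r ~ 27.5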