Timing analysis of low-energy gamma ray emission from galactic compact objects using the Gamma Ray Observatory
The principal goal of our phase 1 investigation was the development of techniques and data analysis tools for pulsar searches and timing. After the launch of the Compton Observatory, we received from the Burst and Transient Source Experiment (BATSE) team one day of discriminator large area (DISCLA) data for use in the development and testing of data analysis techniques. Using this first day of data for testing and optimizing our timing tools, we detected four x-ray binary pulsars: Vela X-1, Cen X-3, 4U 0115+63, and GX 301-2. Subsequently, we received four more days of data, allowing us to test our timing tools with data from a variety of days. In summary, using the tools we developed based on the first day of data that we received, we have detected 8 pulsars in 5 days of data, or roughly one quarter of the approximately 30 known x-ray binary pulsars. In addition to the pulsars listed above, we detected GX 1+4, 4U 1626-67, OAO 1657-415, and Her X-1. Many of the data analysis tools that we developed have been ported to MSFC and are being used for the analysis of BATSE data. This appendix describes some of the timing tools and presents preliminary pulse period and phase profile results.
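The timing tools themselves are not spelled out in this summary. As a rough sketch of the kind of epoch folding and period search such tools are built on, the Python below folds event arrival times at trial periods and scores each folded profile against a flat (unpulsed) hypothesis. The function names and the chi-squared statistic are illustrative assumptions, not the BATSE team's actual pipeline; binned rate data such as DISCLA samples would be folded by accumulating rate values into phase bins rather than counting events.

```python
import numpy as np

def fold_profile(times, period, n_bins=32):
    """Fold event arrival times modulo a trial period into a pulse profile."""
    phases = (times / period) % 1.0
    profile, _ = np.histogram(phases, bins=n_bins, range=(0.0, 1.0))
    return profile

def chi2_flat(profile):
    """Chi-squared of the folded profile against a flat (unpulsed) hypothesis."""
    expected = profile.mean()
    return np.sum((profile - expected) ** 2 / expected)

def period_search(times, trial_periods, n_bins=32):
    """Return the trial period whose folded profile deviates most from flat."""
    scores = [chi2_flat(fold_profile(times, p, n_bins)) for p in trial_periods]
    best = int(np.argmax(scores))
    return trial_periods[best], scores[best]
```

A search for Vela X-1, for example, would scan trial periods around its roughly 283 s pulse period and keep the candidate with the largest deviation from a flat profile.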
XSIL: Extensible Scientific Interchange Language
We motivate and define the XSIL language as a flexible, hierarchical, extensible transport language for scientific data objects. The entire object may be represented in the file, or there may be metadata in the XSIL file, with a powerful, fault-tolerant linking mechanism to external data. The language is based on XML, and is designed not only for parsing and processing by machines, but also for presentation to humans through web browsers and web-database technology. There is a natural mapping between the elements of the XSIL language and the object model into which they are translated by the parser. As well as common objects (Parameter, Array, Time, Table), we have extended XSIL to include the IGWDFrame, used by gravitational-wave observatories.
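As a loose illustration of the hierarchical structure described above, the sketch below assembles a small XSIL-like document with Python's standard xml.etree.ElementTree, keeping the bulk data external and only metadata inline. The element and attribute names are modeled on the objects listed in the abstract (Parameter, Array) but are assumptions, not the normative XSIL schema.

```python
import xml.etree.ElementTree as ET

# Illustrative XSIL-like container: metadata lives in the XML, while the bulk
# data is referenced through an external link rather than embedded inline.
root = ET.Element("XSIL", Name="example_dataset")

param = ET.SubElement(root, "Parameter", Name="sample_rate", Unit="Hz")
param.text = "16384"

array = ET.SubElement(root, "Array", Name="strain", Type="float64")
ET.SubElement(array, "Dim").text = "1048576"
ET.SubElement(array, "Stream", Type="Remote").text = "file://data/strain.bin"

print(ET.tostring(root, encoding="unicode"))
```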
Research in cosmic and gamma ray astrophysics
Discussed here is research in cosmic ray and gamma ray astrophysics at the Space Radiation Laboratory (SRL) of the California Institute of Technology. The primary activities discussed involve the development of new instrumentation and techniques for future space flight. In many cases, these instrumentation developments were tested in balloon flight instruments designed to conduct new investigations in cosmic ray and gamma ray astrophysics. The results of these investigations are briefly summarized. Specific topics include a quantitative investigation of the solar modulation of cosmic ray protons and helium nuclei, a study of cosmic ray positron and electron spectra in interplanetary and interstellar space, the solar modulation of cosmic rays, an investigation of techniques for the measurement and interpretation of cosmic ray isotopic abundances, and a balloon measurement of the isotopic composition of galactic cosmic ray boron, carbon, and nitrogen.
A Laboratory Demonstration of High-Resolution Hard X-ray and Gamma-ray Imaging using Fourier-Transform Techniques
A laboratory imaging system has been developed to study the use of Fourier-transform techniques in high-resolution hard x-ray and γ-ray imaging, with particular emphasis on possible applications to high-energy astronomy. We discuss considerations for the design of a Fourier-transform imager and describe the instrumentation used in the laboratory studies. Several analysis methods for image reconstruction are discussed, including the CLEAN algorithm and maximum entropy methods. Images obtained using these methods are presented.
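The CLEAN algorithm mentioned above admits a compact illustration. The sketch below is a generic Hogbom-style CLEAN loop that iteratively subtracts scaled, shifted copies of the point spread function from a dirty image; it is not the authors' laboratory code, and the gain, threshold, and iteration cap are placeholder values.

```python
import numpy as np

def hogbom_clean(dirty, psf, gain=0.1, threshold=0.01, max_iter=500):
    """Minimal Hogbom CLEAN: repeatedly find the brightest residual pixel,
    subtract a scaled copy of the PSF centred there, and record the removed
    flux as a point-source component. Assumes the PSF array is centred on
    its middle pixel."""
    residual = dirty.copy()
    components = np.zeros_like(dirty)
    psf_peak = psf[psf.shape[0] // 2, psf.shape[1] // 2]
    for _ in range(max_iter):
        y, x = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
        peak = residual[y, x]
        if np.abs(peak) < threshold:
            break
        flux = gain * peak / psf_peak
        components[y, x] += flux
        # Subtract the PSF centred on (y, x); edges are clipped for brevity.
        dy = y - psf.shape[0] // 2
        dx = x - psf.shape[1] // 2
        ys = slice(max(dy, 0), min(dy + psf.shape[0], residual.shape[0]))
        xs = slice(max(dx, 0), min(dx + psf.shape[1], residual.shape[1]))
        pys = slice(ys.start - dy, ys.stop - dy)
        pxs = slice(xs.start - dx, xs.stop - dx)
        residual[ys, xs] -= flux * psf[pys, pxs]
    return components, residual
```

Maximum entropy reconstruction would instead solve a global optimization over the whole image, trading the point-source model for a smoothness prior.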
Pulse Morphology of the Galactic Center Magnetar PSR J1745-2900
We present results from observations of the Galactic Center magnetar, PSR
J1745-2900, at 2.3 and 8.4 GHz with the NASA Deep Space Network 70 m antenna,
DSS-43. We study the magnetar's radio profile shape, flux density, radio
spectrum, and single pulse behavior over a ~1 year period between MJDs 57233
and 57621. In particular, the magnetar exhibits a significantly negative
average spectral index of ⟨α⟩ = -1.86 ± 0.02 when the
8.4 GHz profile is single-peaked, which flattens considerably when the profile
is double-peaked. We have carried out an analysis of single pulses at 8.4 GHz
on MJD 57479 and find that giant pulses and pulses with multiple emission
components are emitted during a significant number of rotations. The resulting
single pulse flux density distribution is incompatible with a log-normal
distribution. The typical pulse width of the components is ~1.8 ms, and the
prevailing delay time between successive components is ~7.7 ms. Many of the
single pulse emission components show significant frequency structure over
bandwidths of ~100 MHz, which we believe is the first observation of such
behavior from a radio magnetar. We report a characteristic single pulse
broadening timescale of ⟨τ_d⟩ = 6.9 ± 0.2 ms at 8.4 GHz.
We find that the pulse broadening is highly variable between emission
components and cannot be explained by a thin scattering screen at distances
≳ 1 kpc. We discuss possible intrinsic and extrinsic mechanisms for the
magnetar's emission and compare our results to other magnetars, high magnetic
field pulsars, and fast radio bursts.
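A mean spectral index such as the one quoted above is conventionally obtained by fitting S(ν) ∝ ν^α to flux density measurements. The sketch below does this by least squares in log-log space; it is a generic procedure rather than the authors' analysis code, and the frequencies and flux densities in the example are invented placeholders.

```python
import numpy as np

def fit_spectral_index(freqs_ghz, flux_mjy):
    """Fit S(nu) ~ nu**alpha by least squares in log-log space; return alpha
    and a rough 1-sigma uncertainty from the fit covariance."""
    x = np.log10(freqs_ghz)
    y = np.log10(flux_mjy)
    (alpha, _), cov = np.polyfit(x, y, 1, cov=True)
    return alpha, np.sqrt(cov[0, 0])

# Placeholder measurements, for illustration only (roughly alpha ~ -1.9):
alpha, err = fit_spectral_index([2.3, 4.5, 8.4, 14.0], [2.0, 0.56, 0.17, 0.065])
print(f"alpha = {alpha:.2f} +/- {err:.2f}")
```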
A novel method for transient detection in high-cadence optical surveys: Its application for a systematic search for novae in M31
[abridged] In large-scale time-domain surveys, the processing of data, from
procurement up to the detection of sources, is generally automated. One of the
main challenges is contamination by artifacts, especially in regions of strong
unresolved emission. We present a novel method for identifying candidates for
variables and transients from the outputs of such surveys' data pipelines. We
use the method to systematically search for novae in iPTF observations of the
bulge of M31. We demonstrate that most artifacts produced by the iPTF pipeline
form a locally uniform background of false detections approximately obeying
Poissonian statistics, whereas genuine variables and transients as well as
artifacts associated with bright stars result in clusters of detections, whose
spread is determined by the source localization accuracy. This makes the
problem analogous to source detection on images produced by X-ray telescopes,
enabling one to utilize tools developed in X-ray astronomy. In particular, we
use a wavelet-based source detection algorithm from the Chandra data analysis
package CIAO. Starting from ~2.5x10^5 raw detections made by the iPTF data
pipeline, we obtain ~4000 unique source candidates. Cross-matching these
candidates with the source-catalog of a deep reference image, we find
counterparts for ~90% of them. These are either artifacts due to imperfect PSF
matching or genuine variable sources. The remaining ~400 detections are
transient sources. We identify novae among these candidates by applying
selection cuts based on the expected properties of nova lightcurves. Thus, we
recovered all 12 known novae registered during the time span of the survey and
discovered three nova candidates. Our method is generic and can be applied for
mining any target out of the artifacts in optical time-domain data. As it is
fully automated, its incompleteness can be accurately computed and corrected
for.
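The key idea above, that genuine sources produce clusters of detections against a locally uniform, roughly Poissonian background of artifacts, can be illustrated with a much cruder test than the wavelet-based CIAO tool the authors actually use: bin the raw detections into cells and flag cells whose counts are improbable under a Poisson null. The function below is such a sketch; the cell size, probability threshold, and background estimate are placeholder choices.

```python
import numpy as np
from scipy.stats import poisson

def flag_clusters(x, y, cell_size, p_threshold=1e-6):
    """Bin raw detection coordinates into cells and flag cells whose counts
    are improbably high for a uniform Poisson background estimated from the
    median occupancy of non-empty cells."""
    x, y = np.asarray(x), np.asarray(y)
    x_bins = np.arange(x.min(), x.max() + cell_size, cell_size)
    y_bins = np.arange(y.min(), y.max() + cell_size, cell_size)
    counts, _, _ = np.histogram2d(x, y, bins=[x_bins, y_bins])
    background = np.median(counts[counts > 0])  # crude background-rate estimate
    # Survival function gives P(N >= n) under the Poisson null hypothesis.
    p_values = poisson.sf(counts - 1, background)
    return counts, p_values < p_threshold
```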
The National Virtual Observatory
As a scientific discipline, Astronomy is rather unique. We only have one
laboratory, the Universe, and we cannot, of course, change the initial
conditions and study the resulting effects. On top of this, acquiring
Astronomical data has historically been a very labor-intensive effort. As a
result, data has traditionally been preserved for posterity. With recent
technological advances, however, the rate at which we acquire new data has
grown exponentially, which has generated a Data Tsunami, whose wave train
threatens to overwhelm the field. In this conference proceedings, we present
and define the concept of virtual observatories, which we feel is the only
logical answer to this dilemma.
VLA Observations of Candidate Supernova Remnants from the Clark Lake 30.9 MHz Galactic Plane Survey
We report the results of 1464 MHz continuum VLA observations of eight fields containing unidentified
small-diameter objects associated with candidate supernova remnants from the Clark Lake 30.9 MHz
galactic plane survey. The observations were made in the C configuration, giving a resolution of
~12-20 arcsec, and a sensitivity of typically <0.5 mJy per beam. Polarization measurements were
made as well. One of the 30.9 MHz candidates, G41.4+1.2, appears to be confirmed as a supernova
remnant by our observations. Of the remaining seven fields observed, three were found to contain
small-diameter objects which met some of the criteria for nonthermal origin, but will require further
study to evaluate whether they are associated with the candidate supernova remnants. Two of the fields
were found to contain groups of unresolved objects consistent with expectations for extragalactic
background sources. In these cases the 30.9 MHz observations, which could not resolve the individual
sources but would view them as a single extended source, may have mistakenly identified them as
possible supernova remnants. Finally, two fields contained bright H II regions.
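The nonthermal criterion referred to above is, in essence, a steep two-point spectral index between the 30.9 MHz and 1464 MHz flux densities, since supernova remnants emit synchrotron radiation with a falling spectrum while thermal H II regions are flat or inverted. The arithmetic is shown below with invented placeholder flux densities.

```python
import math

def two_point_spectral_index(s1_jy, nu1_mhz, s2_jy, nu2_mhz):
    """Two-point spectral index alpha in the S ~ nu**alpha convention."""
    return math.log(s2_jy / s1_jy) / math.log(nu2_mhz / nu1_mhz)

# Placeholder flux densities, for illustration only: 5 Jy at 30.9 MHz and
# 0.7 Jy at 1464 MHz give alpha ~ -0.5, the kind of steep nonthermal
# spectrum expected from a supernova remnant.
print(two_point_spectral_index(5.0, 30.9, 0.7, 1464.0))
```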
A Virtual Data Grid for LIGO
GriPhyN (Grid Physics Network) is a large US collaboration to
build grid services for large physics experiments, one of which is LIGO, a
gravitational-wave observatory. This paper explains the physics and computing
challenges of LIGO, and the tools that GriPhyN will build to address
them. A key component needed to implement the data pipeline is a virtual
data service: a system to dynamically create data products requested during
the various stages. The data may already have been processed in a certain
way, may be in a file on a storage system, may be cached, or may need
to be created through computation. The full elaboration of this system will allow
complex data pipelines to be set up as virtual data objects, with existing
data being transformed in diverse ways.
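As a rough sketch of the virtual data idea described above, returning a product from storage if it already exists and otherwise deriving it on demand from registered transformations, consider the toy catalog below. The class and method names are invented for illustration and are not GriPhyN or LIGO APIs.

```python
import os
import pickle

class VirtualDataService:
    """Toy virtual-data catalog: products are requested by name and either
    read back from a store or derived by running a registered transformation
    on (recursively materialized) inputs. Names are illustrative only."""

    def __init__(self, store_dir):
        self.store_dir = store_dir
        self.recipes = {}  # product name -> (transformation, input names)

    def register(self, name, func, inputs=()):
        self.recipes[name] = (func, list(inputs))

    def materialize(self, name):
        path = os.path.join(self.store_dir, name + ".pkl")
        if os.path.exists(path):  # already derived: reuse the stored copy
            with open(path, "rb") as f:
                return pickle.load(f)
        func, inputs = self.recipes[name]  # otherwise derive it now
        product = func(*[self.materialize(i) for i in inputs])
        with open(path, "wb") as f:  # cache for future requests
            pickle.dump(product, f)
        return product
```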
