113 research outputs found

    Classifying Periodic Astrophysical Phenomena from non-survey optimized variable-cadence observational data

    Modern time-domain astronomy is capable of collecting a staggeringly large amount of data on millions of objects in real time. The production of methods and systems for the automated classification of time-domain astronomical objects is therefore of great importance. The Liverpool Telescope has a number of wide-field image-gathering instruments mounted upon its structure, the Small Telescopes Installed at the Liverpool Telescope. These instruments have been in operation since March 2009, gathering data over large areas of sky around the current field of view of the main telescope and generating a large dataset containing millions of light sources. The instruments are inexpensive to run as they do not require a separate telescope to operate, but this style of surveying the sky introduces structured artifacts into our data due to the variable cadence at which sky fields are resampled. These artifacts can make light sources appear variable and must be addressed by any processing method. The data from large sky surveys can lead to the discovery of interesting new variable objects, and efficient software and analysis tools are required to rapidly determine which potentially variable objects are worthy of further telescope time. Machine learning offers a solution to the quick detection of variability by characterising the detected signals relative to previously seen exemplars. In this paper, we introduce a processing system designed for use with the Liverpool Telescope that identifies potentially interesting objects through the application of a novel representation-learning approach to data collected automatically from the wide-field instruments. Our method automatically produces a set of classification features by applying Principal Component Analysis to a set of variable light curves, each represented by a piecewise polynomial fitted to the epoch-folded data via a genetic algorithm.
The epoch-folding requires the selection of a candidate period for each variable light curve, identified using a genetic-algorithm period-estimation method developed specifically for this dataset. A Random Forest classifier is then applied to the learned features to determine whether a light curve is generated by an object of interest. This system allows the telescope to automatically identify new targets through passive observations that do not affect day-to-day operations, as the unique artifacts resulting from such a survey method are incorporated into the methods. We demonstrate the power of this feature-extraction method against the feature engineering performed by previous studies by training classification models on 859 light curves of 12 known variable star classes from our dataset. We show that our new features produce a model with a superior mean cross-validation F1 score of 0.4729 (standard deviation 0.0931), compared with 0.3902 (standard deviation 0.0619) for the engineered features. We show that the features extracted from the representation learning are given relatively high importance in the final classification model. Additionally, we compare engineered features computed on the interpolated polynomial fits and show that they produce more reliable distributions than those fitted to the raw light curve when the period estimation is correct.
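The pipeline described above (epoch-fold at a candidate period, fit a polynomial to the folded curve, then apply PCA across the fitted curves to learn features) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the genetic-algorithm period search and piecewise fit are replaced by an assumed known period and a single low-order polynomial, and the Random Forest classification step is omitted.

```python
import numpy as np

def fold(times, period):
    # Epoch-fold observation times at a candidate period -> phases in [0, 1)
    return (times / period) % 1.0

def phase_curve_vector(times, mags, period, degree=5, n_grid=50):
    # Fold, fit a low-order polynomial to the folded curve (a stand-in for
    # the genetic-algorithm piecewise fit in the paper), then resample on a
    # fixed phase grid so every light curve becomes an equal-length vector.
    phase = fold(times, period)
    order = np.argsort(phase)
    coeffs = np.polyfit(phase[order], mags[order], degree)
    grid = np.linspace(0.0, 1.0, n_grid)
    return np.polyval(coeffs, grid)

def pca_features(vectors, n_components=3):
    # Learn classification features via PCA (SVD of the mean-centred matrix);
    # the projections onto the leading components are the learned features.
    X = np.asarray(vectors)
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Toy data: sinusoidal "variables" observed at irregular cadence, with an
# assumed known period standing in for the period-search step.
rng = np.random.default_rng(0)
period = 0.7
curves = []
for amp in (0.5, 1.0, 1.5, 2.0):
    t = np.sort(rng.uniform(0, 30, 120))
    m = 15.0 + amp * np.sin(2 * np.pi * t / period) + rng.normal(0, 0.02, t.size)
    curves.append(phase_curve_vector(t, m, period))

feats = pca_features(curves, n_components=2)
print(feats.shape)  # one low-dimensional feature row per light curve -> (4, 2)
```

In a full system these feature rows, one per light curve, would be the training input to the Random Forest classifier.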

    Discovery, classification, and scientific exploration of transient events from the Catalina Real-time Transient Survey

    Exploration of the time domain - variable and transient objects and phenomena - is rapidly becoming a vibrant research frontier, touching on essentially every field of astronomy and astrophysics, from the Solar system to cosmology. Time-domain astronomy is being enabled by the advent of a new generation of synoptic sky surveys that cover large areas of the sky repeatedly and generate massive data streams. Their scientific exploration poses many challenges, driven mainly by the need for real-time discovery, classification, and follow-up of the interesting events. Here we describe the Catalina Real-Time Transient Survey (CRTS), which discovers and publishes transient events at optical wavelengths in real time, thus benefiting the entire community. We describe some of the scientific results to date, and then focus on the challenges of the automated classification and prioritization of transient events. CRTS represents a scientific and technological testbed and precursor for larger surveys in the future, including the Large Synoptic Survey Telescope (LSST) and the Square Kilometer Array (SKA). Comment: 22 pages, 12 figures, invited review for the Bulletin of the Astronomical Society of India

    Automated Real-Time Classification and Decision Making in Massive Data Streams from Synoptic Sky Surveys

    The nature of scientific and technological data collection is evolving rapidly: data volumes and rates grow exponentially, with increasing complexity and information content, and there has been a transition from static data sets to data streams that must be analyzed in real time. Interesting or anomalous phenomena must be quickly characterized and followed up with additional measurements via optimal deployment of limited assets. Modern astronomy presents a variety of such phenomena in the form of transient events in digital synoptic sky surveys, including cosmic explosions (supernovae, gamma-ray bursts), relativistic phenomena (black hole formation, jets), potentially hazardous asteroids, etc. We have been developing a set of machine learning tools to detect, classify, and plan a response to transient events for astronomy applications, using the Catalina Real-time Transient Survey (CRTS) as a scientific and methodological testbed. The ability to respond rapidly to the potentially most interesting events is a key bottleneck that limits the scientific returns from current and anticipated synoptic sky surveys. Similar challenges arise in other contexts, from environmental monitoring using sensor networks to autonomous spacecraft systems. Given the exponential growth of data rates and the time-critical nature of the response, a fully automated and robust approach is needed. We describe the results obtained to date, and the possible future developments. Comment: 8 pages, IEEE conference format, to appear in the refereed proceedings of the IEEE e-Science 2014 conf., eds. C. Medeiros et al., IEEE, in press (2014). arXiv admin note: substantial text overlap with arXiv:1209.1681, arXiv:1110.465

    From Data to Software to Science with the Rubin Observatory LSST

    The Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST) dataset will dramatically alter our understanding of the Universe, from the origins of the Solar System to the nature of dark matter and dark energy. Much of this research will depend on the existence of robust, tested, and scalable algorithms, software, and services. Identifying and developing such tools ahead of time has the potential to significantly accelerate the delivery of early science from LSST. Developing these collaboratively, and making them broadly available, can enable more inclusive and equitable collaboration on LSST science. To facilitate such opportunities, a community workshop entitled "From Data to Software to Science with the Rubin Observatory LSST" was organized by the LSST Interdisciplinary Network for Collaboration and Computing (LINCC) and partners, and held at the Flatiron Institute in New York, March 28-30th 2022. The workshop included over 50 in-person attendees invited from over 300 applications. It identified seven key software areas of need: (i) scalable cross-matching and distributed joining of catalogs, (ii) robust photometric redshift determination, (iii) software for determination of selection functions, (iv) frameworks for scalable time-series analyses, (v) services for image access and reprocessing at scale, (vi) object image access (cutouts) and analysis at scale, and (vii) scalable job execution systems. This white paper summarizes the discussions of this workshop. It considers the motivating science use cases, identified cross-cutting algorithms, software, and services, their high-level technical specifications, and the principles of inclusive collaborations needed to develop them. 
We provide it as a useful roadmap of needs, as well as to spur action and collaboration between groups and individuals looking to develop reusable software for early LSST science. Comment: White paper from the "From Data to Software to Science with the Rubin Observatory LSST" workshop

    Rubin Observatory LSST Transients and Variable Stars Roadmap

    The Vera C. Rubin Legacy Survey of Space and Time holds the potential to revolutionize time domain astrophysics, reaching completely unexplored areas of the Universe and mapping variability time scales from minutes to a decade. To prepare to maximize the potential of the Rubin LSST data for the exploration of the transient and variable Universe, one of the four pillars of Rubin LSST science, the Transient and Variable Stars Science Collaboration, one of the eight Rubin LSST Science Collaborations, has identified research areas of interest and requirements, and paths to enable them. While our roadmap is ever-evolving, this document represents a snapshot of our plans and preparatory work in the final years and months leading up to the survey's first light.

    Astrophysical Modeling of Time-Domain Surveys

    The goal of this work is to develop and apply algorithmic approaches for astrophysical modeling of time-domain surveys. Such approaches are necessary to exploit ongoing and future all-sky time-domain surveys. I focus on quantifying and characterizing source variability based on sparsely and irregularly sampled, non-simultaneous multi-band light curves, with an application to the Pan-STARRS1 (PS1) 3π survey: variability amplitudes and timescales are estimated via light curve structure functions. Using PS1 3π data on the SDSS "Stripe 82" area, for which classifications are available, a supervised machine-learning classifier is trained to identify QSOs and RR Lyrae based on their variability and mean colors. This leads to quite complete and pure variability-selected samples of QSOs and RR Lyrae (away from the Galactic disk) that are unmatched in their combination of area, depth, and fidelity. The sample comprises 4.8 x 10^4 likely RR Lyrae in the Galactic halo and 3.7 x 10^6 likely QSOs. The resulting map of RR Lyrae candidates across 3/4 of the sky reveals targets out to 130 kpc, with distances precise to 3%. In particular, the sample leads to an unprecedented map of the distance and width of the Sagittarius stream, as traced by RR Lyrae. Furthermore, the role of PS1 3π as a pilot survey for the upcoming LSST survey is discussed.
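The first-order structure function used above to estimate variability amplitudes and timescales is the RMS magnitude difference between epoch pairs as a function of their time lag. A minimal sketch, with a toy random-walk light curve standing in for real QSO variability (the binning and data here are illustrative assumptions, not the thesis' actual configuration):

```python
import numpy as np

def structure_function(times, mags, lag_bins):
    # First-order structure function: RMS magnitude difference over all
    # unique epoch pairs, binned by the time lag between the epochs.
    dt = np.abs(times[:, None] - times[None, :])
    dm = mags[:, None] - mags[None, :]
    iu = np.triu_indices(len(times), k=1)  # each pair counted once
    dt, dm = dt[iu], dm[iu]
    sf = np.full(len(lag_bins) - 1, np.nan)
    for i in range(len(lag_bins) - 1):
        sel = (dt >= lag_bins[i]) & (dt < lag_bins[i + 1])
        if sel.any():
            sf[i] = np.sqrt(np.mean(dm[sel] ** 2))
    return sf

# Toy example: a random-walk light curve (QSO-like) sampled irregularly.
# Its structure function should rise with time lag, unlike pure noise.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 1000, 200))        # epochs in days
m = np.cumsum(rng.normal(0, 0.05, t.size))    # random-walk magnitudes
bins = np.array([1.0, 10.0, 100.0, 1000.0])   # lag bins in days
print(structure_function(t, m, bins))
```

The amplitude and the lag at which the curve flattens are the kind of features a classifier can combine with mean colors to separate QSOs from RR Lyrae.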

    LSST Science Book, Version 2.0

    A survey that can cover the sky in optical bands over wide fields to faint magnitudes with a fast cadence will enable many of the exciting science opportunities of the next decade. The Large Synoptic Survey Telescope (LSST) will have an effective aperture of 6.7 meters and an imaging camera with a field of view of 9.6 deg^2, and will be devoted to a ten-year imaging survey over 20,000 deg^2 south of +15 deg. Each pointing will be imaged 2000 times with fifteen-second exposures in six broad bands from 0.35 to 1.1 microns, to a total point-source depth of r~27.5. The LSST Science Book describes the basic parameters of the LSST hardware, software, and observing plans. The book discusses educational and outreach opportunities, then goes on to describe a broad range of science that LSST will revolutionize: mapping the inner and outer Solar System, stellar populations in the Milky Way and nearby galaxies, the structure of the Milky Way disk and halo and other objects in the Local Volume, transient and variable objects both at low and high redshift, and the properties of normal and active galaxies at low and high redshift. It then turns to far-field cosmological topics, exploring properties of supernovae to z~1, strong and weak lensing, the large-scale distribution of galaxies and baryon oscillations, and how these different probes may be combined to constrain cosmological models and the physics of dark energy. Comment: 596 pages. Also available at full resolution at http://www.lsst.org/lsst/sciboo
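The quoted coadded depth can be sanity-checked with standard background-limited stacking arithmetic: coadding N equal-depth visits reduces the flux noise by sqrt(N), so the 5-sigma limiting magnitude deepens by 2.5 log10(sqrt(N)) = 1.25 log10(N). The per-band visit count (~184 of the ~2000 visits in the r band) and single-visit depth (r ~ 24.7) used below are illustrative assumptions, not values stated in the abstract.

```python
import math

def coadd_depth(m_single, n_visits):
    # Limiting magnitude of a coadd of n_visits equal-depth images in the
    # background-limited, Gaussian-noise regime: gain = 1.25 * log10(N).
    return m_single + 1.25 * math.log10(n_visits)

# Assumed (illustrative) numbers: ~184 r-band visits, each reaching r ~ 24.7.
print(round(coadd_depth(24.7, 184), 1))  # -> 27.5, matching the quoted depth
```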