24 research outputs found

    Approximate Inference for Constructing Astronomical Catalogs from Images

    We present a new, fully generative model for constructing astronomical catalogs from optical telescope image sets. Each pixel intensity is treated as a random variable with parameters that depend on the latent properties of stars and galaxies. These latent properties are themselves modeled as random. We compare two procedures for posterior inference. One procedure is based on Markov chain Monte Carlo (MCMC) while the other is based on variational inference (VI). The MCMC procedure excels at quantifying uncertainty, while the VI procedure is 1000 times faster. On a supercomputer, the VI procedure efficiently uses 665,000 CPU cores to construct an astronomical catalog from 50 terabytes of images in 14.6 minutes, demonstrating the scaling characteristics necessary to construct catalogs for upcoming astronomical surveys. Comment: accepted to the Annals of Applied Statistics
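    The MCMC-versus-VI comparison the abstract describes can be illustrated on a toy conjugate model where both procedures target the same posterior. The sketch below (not the paper's actual model; all names and step sizes are illustrative) infers the mean of Gaussian data with a Gaussian prior, once by random-walk Metropolis and once by reparameterized gradient ascent on the ELBO:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Toy model: x_i ~ N(mu, 1), prior mu ~ N(0, 10^2).
    data = rng.normal(2.0, 1.0, size=100)
    n, prior_var = len(data), 100.0

    # Exact conjugate posterior, for reference: N(post_mean, post_var).
    post_var = 1.0 / (n + 1.0 / prior_var)
    post_mean = post_var * data.sum()

    def log_post(mu):
        """Unnormalized log posterior of mu given the data."""
        return -0.5 * mu**2 / prior_var - 0.5 * ((data - mu) ** 2).sum()

    # --- MCMC: random-walk Metropolis ---
    samples, mu = [], 0.0
    for _ in range(5000):
        prop = mu + rng.normal(0.0, 0.3)
        if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
            mu = prop
        samples.append(mu)
    mcmc_mean = np.mean(samples[1000:])  # discard burn-in

    # --- VI: fit q(mu) = N(m, s^2) by stochastic gradient ascent on the ELBO ---
    m, log_s = 0.0, 0.0
    for _ in range(2000):
        eps = rng.normal()                       # reparameterization trick
        s = np.exp(log_s)
        z = m + s * eps
        dlogp = -z / prior_var + (data - z).sum()  # d/dz of log joint
        m += 1e-3 * dlogp                        # gradient of E_q[log p]
        log_s += 1e-3 * (dlogp * eps * s + 1.0)  # + entropy gradient
    ```

    On this conjugate problem both estimates of the posterior mean should agree with the closed form; the practical difference the abstract emphasizes (uncertainty quality for MCMC, speed for VI) only shows up at scale.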

    The Data Big Bang and the Expanding Digital Universe: High-Dimensional, Complex and Massive Data Sets in an Inflationary Epoch

    Recent and forthcoming advances in instrumentation, and giant new surveys, are creating astronomical data sets that are not amenable to the methods of analysis familiar to astronomers. Traditional methods are often inadequate not merely because of the size in bytes of the data sets, but also because of the complexity of modern data sets. Mathematical limitations of familiar algorithms and techniques in dealing with such data sets create a critical need for new paradigms for the representation, analysis and scientific visualization (as opposed to illustrative visualization) of heterogeneous, multiresolution data across application domains. Some of the problems presented by the new data sets have been addressed by other disciplines such as applied mathematics, statistics and machine learning, and have been utilized by other sciences such as space-based geosciences. Unfortunately, valuable results pertaining to these problems are mostly to be found only in publications outside of astronomy. Here we offer brief overviews of a number of concepts, techniques and developments, some "old" and some new. These are generally unknown to most of the astronomical community, but are vital to the analysis and visualization of complex datasets and images. In order for astronomers to take advantage of the richness and complexity of the new era of data, and to be able to identify, adopt, and apply new solutions, the astronomical community needs a certain degree of awareness and understanding of the new concepts. One of the goals of this paper is to help bridge the gap between applied mathematics, artificial intelligence and computer science on the one side and astronomy on the other. Comment: 24 pages, 8 Figures, 1 Table. Accepted for publication in "Advances in Astronomy", special issue "Robotic Astronomy"

    Data Mining and Machine Learning in Astronomy

    We review the current state of data mining and machine learning in astronomy. 'Data Mining' can have a somewhat mixed connotation from the point of view of a researcher in this field. If used correctly, it can be a powerful approach, holding the potential to fully exploit the exponentially increasing amount of available data, promising great scientific advance. However, if misused, it can be little more than the black-box application of complex computing algorithms that may give little physical insight, and provide questionable results. Here, we give an overview of the entire data mining process, from data collection through to the interpretation of results. We cover common machine learning algorithms, such as artificial neural networks and support vector machines, applications from a broad range of astronomy, emphasizing those where data mining techniques directly resulted in improved science, and important current and future directions, including probability density functions, parallel algorithms, petascale computing, and the time domain. We conclude that, so long as one carefully selects an appropriate algorithm, and is guided by the astronomical problem at hand, data mining can be very much the powerful tool, and not the questionable black box. Comment: Published in IJMPD. 61 pages, uses ws-ijmpd.cls. Several extra figures, some minor additions to the text
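    A minimal example of the kind of supervised-classification workflow the review covers: training a simple classifier on synthetic two-class "photometric" data. Everything here is a hypothetical illustration (the features, class labels, and learning rate are invented, not drawn from the review), using logistic regression rather than the neural networks or SVMs the abstract names:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Synthetic "colour-colour" features for two well-separated classes,
    # loosely standing in for star/galaxy separation. Purely illustrative.
    stars = rng.normal([0.2, 0.5], 0.15, size=(500, 2))
    galaxies = rng.normal([0.8, 1.2], 0.15, size=(500, 2))
    X = np.vstack([stars, galaxies])
    y = np.concatenate([np.zeros(500), np.ones(500)])  # 0 = star, 1 = galaxy

    # Logistic regression trained by plain batch gradient descent.
    w, b = np.zeros(2), 0.0
    for _ in range(500):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(galaxy)
        w -= 1.0 * (X.T @ (p - y)) / len(y)
        b -= 1.0 * (p - y).mean()

    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    accuracy = ((p > 0.5) == y).mean()
    ```

    The review's caution applies even to a toy like this: the classifier outputs probabilities, but whether a decision threshold of 0.5 is physically sensible depends on the astronomical problem, not the algorithm.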

    LSST Science Book, Version 2.0

    A survey that can cover the sky in optical bands over wide fields to faint magnitudes with a fast cadence will enable many of the exciting science opportunities of the next decade. The Large Synoptic Survey Telescope (LSST) will have an effective aperture of 6.7 meters and an imaging camera with field of view of 9.6 deg^2, and will be devoted to a ten-year imaging survey over 20,000 deg^2 south of +15 deg. Each pointing will be imaged 2000 times with fifteen second exposures in six broad bands from 0.35 to 1.1 microns, to a total point-source depth of r~27.5. The LSST Science Book describes the basic parameters of the LSST hardware, software, and observing plans. The book discusses educational and outreach opportunities, then goes on to describe a broad range of science that LSST will revolutionize: mapping the inner and outer Solar System, stellar populations in the Milky Way and nearby galaxies, the structure of the Milky Way disk and halo and other objects in the Local Volume, transient and variable objects both at low and high redshift, and the properties of normal and active galaxies at low and high redshift. It then turns to far-field cosmological topics, exploring properties of supernovae to z~1, strong and weak lensing, the large-scale distribution of galaxies and baryon oscillations, and how these different probes may be combined to constrain cosmological models and the physics of dark energy. Comment: 596 pages. Also available at full resolution at http://www.lsst.org/lsst/sciboo

    EuCAPT White Paper: Opportunities and Challenges for Theoretical Astroparticle Physics in the Next Decade

    Astroparticle physics is undergoing a profound transformation, due to a series of extraordinary new results, such as the discovery of high-energy cosmic neutrinos with IceCube, the direct detection of gravitational waves with LIGO and Virgo, and many others. This white paper is the result of a collaborative effort that involved hundreds of theoretical astroparticle physicists and cosmologists, under the coordination of the European Consortium for Astroparticle Theory (EuCAPT). Addressed to the whole astroparticle physics community, it explores upcoming theoretical opportunities and challenges for our field of research, with particular emphasis on the possible synergies among different subfields, and the prospects for solving the most fundamental open questions with multi-messenger observations. Comment: White paper of the European Consortium for Astroparticle Theory (EuCAPT). 135 authors, 400 endorsers, 133 pages, 1382 references

    Probing the nature of dark energy with 21-cm intensity mapping.

    Doctoral Degree. University of KwaZulu-Natal, Durban. Two approaches to measuring BAOs (baryon acoustic oscillations) with optical and radio telescopes, namely galaxy redshift and intensity mapping (IM) surveys, have been introduced and discussed in the literature. Of the two methods, the galaxy redshift survey has been used to great effect; it is based on detecting and surveying millions of individual galaxies and measuring their redshifts by comparing templates of the spectral energy distributions of the light emitted from the galaxies with optical lines. IM is a novel but robust approach that surveys extremely large volumes of galaxies without resolving each individual galaxy, and can efficiently probe scales over redshift ranges inaccessible to current galaxy redshift surveys. The IM survey has promisingly been shown to have better overall sensitivity to the BAOs than the galaxy redshift survey, but it has a number of serious issues still to be quantified. The most obvious of these is the presence of foreground contaminants from the Milky Way and extragalactic point sources, which strongly dominate the neutral hydrogen (Hi) signal of interest. In this study, we aim to realize the IM approach, pave the pathway, and optimize the scientific outputs of future radio experiments. We therefore carry out simulations and present forecasts of the cosmological constraints obtainable with the Hi IM technique for three near-term radio telescopes, assuming one year of observational time. The telescopes considered here are the Five-hundred-meter Aperture Spherical radio Telescope (FAST), BAOs In Neutral Gas Observations (BINGO), and Square Kilometre Array Phase I (SKA-I) single-dish experiments. We further forecast the combined constraints of the three radio telescopes with Planck measurements.
In order to tackle the foreground challenge, we develop strategies to model various sky components and employ an approach to clean the contamination from our Milky Way galaxy and extragalactic point sources, considering a typical single-dish radio telescope. In particular, the Principal Component Analysis foreground separation approach considered can indeed recover the cosmological Hi signal to high precision. We show that, although the approach may face some challenges, it can be fully realized on the selected range of angular scales.
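    The intuition behind Principal Component Analysis foreground cleaning can be sketched on simulated data. This is a toy version, not the thesis's pipeline: foregrounds are spectrally smooth and orders of magnitude brighter than the Hi signal, so they are absorbed by a handful of dominant frequency eigenmodes, which are then projected out (all amplitudes, spectral indices, and mode counts below are invented for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_freq, n_pix = 64, 2000
    freqs = np.linspace(0.9, 1.1, n_freq)[:, None]  # illustrative band, GHz

    # Bright, spectrally smooth "foregrounds": power laws whose
    # amplitude and index vary from pixel to pixel.
    amp = rng.lognormal(mean=5.0, sigma=0.5, size=n_pix)
    index = rng.normal(-2.7, 0.2, size=n_pix)
    foreground = amp * freqs ** index               # shape (n_freq, n_pix)

    # Faint, spectrally unsmooth "Hi" signal.
    hi = rng.normal(0.0, 1.0, size=(n_freq, n_pix))
    sky = foreground + hi

    # PCA along the frequency axis: the few largest eigenmodes of the
    # frequency-frequency covariance absorb the smooth foregrounds.
    mean = sky.mean(axis=1, keepdims=True)
    cov = (sky - mean) @ (sky - mean).T / n_pix
    eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
    modes = eigvecs[:, -3:]                         # top 3 components
    cleaned = sky - mean - modes @ (modes.T @ (sky - mean))

    raw_rms = sky.std()
    residual_rms = cleaned.std()                    # ~ the Hi level
    ```

    The toy also shows the method's known cost, which the thesis quantifies properly: the projected-out modes carry a little Hi signal too, so the cleaned map slightly underestimates the cosmological power (signal loss grows with the number of modes removed).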