
    Prospects for Gaia and other space-based surveys

    Gaia is a fully approved all-sky astrometric and photometric survey due for launch in 2011. It will measure accurate parallaxes and proper motions for everything brighter than G=20 (ca. 10^9 stars). Its primary objective is to study the composition, origin and evolution of our Galaxy from the 3D structure, 3D velocities, abundances and ages of its stars. In some respects it can be considered as a cosmological survey at redshift zero. Several other upcoming space-based surveys, in particular JWST and Herschel, will study star and galaxy formation in the early (high-redshift) universe. In this paper I briefly describe these missions, as well as SIM and Jasmine, and explain why they need to observe from space. I then discuss some Galactic science contributions of Gaia concerning dark matter, the search for substructure, stellar populations and the mass-luminosity relation. The Gaia data are complex and require the development of novel analysis methods; here I summarize the principle of the astrometric processing. In the last two sections I outline how the Gaia data can be exploited in connection with other observational and theoretical work in order to build up a more comprehensive picture of Galactic evolution.
    Comment: To appear in the proceedings of the JD13 "Exploiting large surveys for Galactic astronomy" held at the IAU GA 2006, Prague. 9 pages.
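
    As an illustrative aside (not from the paper), the basic quantities Gaia delivers translate into distances and transverse velocities via two simple relations. The sketch below assumes well-measured parallaxes; the mission itself relies on a global astrometric solution rather than this naive inversion.

        # Hedged sketch: turning a parallax and proper motion into a distance
        # and a transverse velocity. The 1/parallax estimator is only a good
        # approximation for small fractional parallax errors.

        def distance_pc(parallax_mas):
            """Distance in parsec from a parallax in milliarcseconds."""
            return 1000.0 / parallax_mas

        def transverse_velocity_kms(pm_mas_per_yr, parallax_mas):
            """Transverse velocity in km/s from proper motion and parallax."""
            # The factor 4.74 converts 1 arcsec/yr at a distance of 1 pc into km/s.
            return 4.74 * pm_mas_per_yr / parallax_mas

        # Example: parallax = 2 mas, proper motion = 10 mas/yr
        # -> 500 pc and about 23.7 km/s across the line of sight.
        print(distance_pc(2.0), transverse_velocity_kms(10.0, 2.0))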

    A Package for the Automated Classification of Periodic Variable Stars

    We present a machine learning package for the classification of periodic variable stars. Our package is intended to be general: it can classify any single-band optical light curve comprising at least a few tens of observations covering durations from weeks to years, with arbitrary time sampling. We use light curves of periodic variable stars taken from OGLE and EROS-2 to train the model. To make our classifier relatively survey-independent, it is trained on 16 features extracted from the light curves (e.g. period, skewness, Fourier amplitude ratio). The model classifies light curves into one of seven superclasses - Delta Scuti, RR Lyrae, Cepheid, Type II Cepheid, eclipsing binary, long-period variable, non-variable - as well as subclasses of these, such as the ab, c, d, and e types for RR Lyraes. When trained to give only superclasses, our model achieves 0.98 for both recall and precision (on a scale of 0 to 1), as measured on an independent validation dataset. When trained to give subclasses, it achieves 0.81 for both recall and precision. To assess the classification performance of the subclass model, we applied it to the MACHO, LINEAR, and ASAS periodic variables, which gave recall/precision of 0.92/0.98, 0.89/0.96, and 0.84/0.88, respectively. We also applied the subclass model to Hipparcos periodic variable stars of many other variability types that do not exist in our training set, in order to examine how much those types degrade the classification performance of our target classes. In addition, we investigate how the performance varies with the number of data points and the duration of observations. We find that recall and precision do not vary significantly if the number of data points is larger than 80 and the duration is more than a few weeks. The classifier software of the subclass model is available from the GitHub repository (https://goo.gl/xmFO6Q).
    Comment: 16 pages, 11 figures, accepted for publication in A&A.
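
    A minimal sketch of the feature-based classification idea described above is given below. The three features and the random forest classifier are illustrative stand-ins for the paper's 16-feature model, and the function names are my own rather than the released package's API.

        # Minimal sketch (not the released package): extract a few
        # survey-independent features and train a classifier on them.
        import numpy as np
        from astropy.timeseries import LombScargle
        from scipy.stats import skew
        from sklearn.ensemble import RandomForestClassifier

        def extract_features(t, mag):
            """Period (via Lomb-Scargle), skewness and scatter of one light curve."""
            frequency, power = LombScargle(t, mag).autopower()
            period = 1.0 / frequency[np.argmax(power)]
            return [period, skew(mag), np.std(mag)]

        def train_classifier(light_curves, labels):
            """light_curves: list of (t, mag) arrays; labels: class names."""
            X = [extract_features(t, m) for t, m in light_curves]
            clf = RandomForestClassifier(n_estimators=300, random_state=0)
            return clf.fit(X, labels)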

    Achieving a wide field near infrared camera for the Calar Alto 3.5m telescope

    The ongoing development of large infrared array detectors has enabled wide-field, deep surveys to be undertaken. There are, however, a number of challenges in building an infrared instrument which has both excellent optical quality and high sensitivity over a wide field. We discuss these problems in the context of building a wide-field imaging camera for the 3.5m telescope at Calar Alto with the new 2K x 2K HgCdTe HAWAII-2 focal plane array. Our final design is a prime focus camera with a 15' field of view, called Omega 2000. To achieve excellent optical quality over the whole field, we have had to dispense with the reimaging optics and cold Lyot stop. We show that creative baffling schemes, including the use of undersized baffles, can compensate for the lost K-band sensitivity. A moving baffle will be employed in Omega 2000 to allow full transmission in the non-thermal J and H bands.
    Comment: To appear in the SPIE proceedings of "Optical and IR Telescope Instrumentation and Detectors", Munich, March 2000.
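
    As a rough consistency check (not taken from the paper), the quoted 15' field follows from the detector format and plate scale. The effective focal ratio below is an assumed value chosen for illustration; the 18 micron pixel pitch is the nominal HAWAII-2 value.

        # Back-of-the-envelope field-of-view estimate (assumed focal ratio).
        D_m = 3.5          # Calar Alto telescope aperture in metres
        f_ratio = 2.35     # assumed effective focal ratio at prime focus
        pixel_um = 18.0    # nominal HAWAII-2 pixel pitch in microns
        npix = 2048        # array size per side

        focal_mm = D_m * 1000.0 * f_ratio
        scale = 206265.0 * (pixel_um / 1000.0) / focal_mm   # arcsec per pixel
        fov_arcmin = scale * npix / 60.0
        print(f"{scale:.2f} arcsec/pixel, {fov_arcmin:.1f} arcmin field of view")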

    Assessment of stochastic and deterministic models of 6304 quasar lightcurves from SDSS Stripe 82

    The optical light curves of many quasars show variations of tenths of a magnitude or more on time scales of months to years. This variation often cannot be described well by a simple deterministic model. We perform a Bayesian comparison of over 20 deterministic and stochastic models on 6304 QSO light curves in SDSS Stripe 82. We include the damped random walk (or Ornstein-Uhlenbeck [OU] process), a particular type of stochastic model on which recent studies have focused. Further models we consider are single and double sinusoids, multiple OU processes, higher-order continuous autoregressive processes, and composite models. We find that only 29 out of 6304 QSO light curves are described significantly better by a deterministic model than by a stochastic one. The OU process is an adequate description of the vast majority of cases (6023). Indeed, the OU process is the best single model for 3462 light curves, with the composite OU process/sinusoid model being the best in 1706 cases. The latter model is the dominant one for brighter/bluer QSOs. Furthermore, a non-negligible fraction of QSO light curves show evidence that not only the mean but also the variance is stochastic. Our results confirm earlier work that QSO light curves can be described with a stochastic model, but place this on a firmer footing, and further show that the OU process is preferred over several other stochastic and deterministic models. Of course, there may well exist yet better (deterministic or stochastic) models which have not been considered here.
    Comment: Accepted by A&A. 12 pages, 11 figures, 4 tables.
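
    For readers unfamiliar with the damped random walk, the short sketch below simulates an OU-process light curve on an irregular time grid using the exact conditional update. The parameter values are arbitrary and not taken from the paper.

        import numpy as np

        def simulate_ou(t, mu=19.0, tau=200.0, sigma=0.15, seed=0):
            """Exact OU (damped random walk) update on an irregular grid t (days).
            mu: mean magnitude, tau: damping timescale, sigma: asymptotic scatter."""
            rng = np.random.default_rng(seed)
            x = np.empty(len(t))
            x[0] = mu + sigma * rng.standard_normal()
            for i in range(1, len(t)):
                a = np.exp(-(t[i] - t[i - 1]) / tau)
                x[i] = mu + a * (x[i - 1] - mu) \
                       + sigma * np.sqrt(1.0 - a * a) * rng.standard_normal()
            return x

        epochs = np.sort(np.random.default_rng(1).uniform(0.0, 3000.0, 120))
        mag = simulate_ou(epochs)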

    Automated Classification of Stellar Spectra. II: Two-Dimensional Classification with Neural Networks and Principal Components Analysis

    We investigate the application of neural networks to the automation of MK spectral classification. The data set for this project consists of over 5000 optical (3800-5200 Å) spectra obtained from objective prism plates from the Michigan Spectral Survey. These spectra, along with their two-dimensional MK classifications listed in the Michigan Henry Draper Catalogue, were used to develop supervised neural network classifiers. We show that neural networks can give accurate spectral type classifications (σ_68 = 0.82 subtypes, σ_rms = 1.09 subtypes) across the full range of spectral types present in the data set (B2-M7). We also show that the networks yield correct luminosity classes for over 95% of both dwarfs and giants with a high degree of confidence. Stellar spectra generally contain a large amount of redundant information. We investigate the application of Principal Components Analysis (PCA) to the optimal compression of spectra, and show that PCA can compress the spectra by a factor of over 30 while retaining essentially all of the useful information in the data set. Furthermore, this compression optimally removes noise and can be used to identify unusual spectra.
    Comment: To appear in MNRAS. 15 pages, 17 figures, 7 tables. Two large figures (nos. 4 and 15) are supplied as separate GIF files. The complete paper can be obtained as a single gzipped PS file from http://wol.ra.phy.cam.ac.uk/calj/p1.htm
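
    The PCA compression step lends itself to a compact illustration. The sketch below uses scikit-learn rather than whatever implementation the authors used, and the placeholder array stands in for real spectra resampled onto a common wavelength grid.

        import numpy as np
        from sklearn.decomposition import PCA

        # Placeholder data: 5000 "spectra" of 820 pixels each; real spectra
        # would be continuum-normalized flux on a common wavelength grid.
        spectra = np.random.default_rng(0).normal(size=(5000, 820))

        pca = PCA(n_components=25)                             # ~30x compression
        coefficients = pca.fit_transform(spectra)              # compressed representation
        reconstruction = pca.inverse_transform(coefficients)   # denoised spectra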

    Assessing the influence of the solar orbit on terrestrial biodiversity

    The terrestrial fossil record shows significant variation in the extinction and origination rates of species over the past half-billion years. Numerous studies have claimed an association between this variation and the motion of the Sun around the Galaxy, invoking the modulation of cosmic rays, gamma rays, and comet impact frequency as a cause of this biodiversity variation. However, some of these studies exhibit methodological problems or were based on coarse assumptions (such as a strict periodicity of the solar orbit). Here we investigate this link in more detail, using a model of the Galaxy to reconstruct the solar orbit and thus to build a predictive model of the temporal variation of the extinction rate due to astronomical mechanisms. We compare these predictions, as well as those of various reference models, with paleontological data. Our approach involves Bayesian model comparison, which takes into account the uncertainties in the paleontological data as well as the distribution of solar orbits consistent with the uncertainties in the astronomical data. We find that various versions of the orbital model are not favored over simpler reference models. In particular, the distribution of mass extinction events can be explained just as well by a uniform random distribution as by any other model tested. Although our negative results on the orbital model are robust to changes in the Galaxy model, the Sun's coordinates, and the errors in the data, we also find that it would be very difficult to positively identify the orbital model even if it were the true one. (In contrast, we do find evidence against simpler periodic models.) Thus, while we cannot rule out some connection between solar motion and biodiversity variations on the Earth, we conclude that it is difficult to draw convincing positive conclusions about such a connection from current data.
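
    To make the model-comparison idea concrete, here is a toy version with made-up event times and priors: the marginal likelihood of a uniform model is compared with that of a simple periodically modulated event rate, estimated by Monte Carlo averaging over the prior. Nothing here reproduces the paper's Galaxy model, data, or priors.

        import numpy as np

        rng = np.random.default_rng(0)
        T = 550.0                                    # time span in Myr
        events = np.sort(rng.uniform(0.0, T, 20))    # placeholder extinction times

        def log_evidence_uniform(times):
            # Each event time has density 1/T under the uniform model.
            return -len(times) * np.log(T)

        def log_evidence_periodic(times, n_samples=5000):
            grid = np.linspace(0.0, T, 2000)
            log_like = np.empty(n_samples)
            for k in range(n_samples):
                period = rng.uniform(10.0, 250.0)     # prior on period (Myr)
                phase = rng.uniform(0.0, 2.0 * np.pi) # prior on phase
                amp = rng.uniform(0.0, 1.0)           # prior on modulation amplitude
                rate = 1.0 + amp * np.cos(2.0 * np.pi * grid / period + phase)
                # Normalized density of an event time given this rate curve.
                density = np.interp(times, grid, rate) / (rate.mean() * T)
                log_like[k] = np.sum(np.log(density))
            # Monte Carlo estimate of the marginal likelihood over the prior.
            return np.logaddexp.reduce(log_like) - np.log(n_samples)

        log_bayes_factor = log_evidence_periodic(events) - log_evidence_uniform(events)
        print(f"log Bayes factor (periodic vs uniform): {log_bayes_factor:.2f}")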