912 research outputs found

    Gaia: Organisation and challenges for the data processing

    Gaia is an ambitious space astrometry mission of ESA whose main objective is to map the sky in astrometry and photometry down to magnitude 20 by the end of the next decade. While the mission is built and operated by ESA and an industrial consortium, the data processing is entrusted to a consortium drawn from the scientific community, formed in 2006 and formally selected by ESA one year later. The satellite will downlink around 100 TB of raw telemetry data over a mission duration of 5 years, from which a very complex iterative processing will lead to the final science output: astrometry with a final accuracy of a few tens of microarcseconds, epoch photometry in wide and narrow bands, and radial velocities and spectra for stars brighter than 17 mag. We discuss the general principles and main difficulties of this very large data processing task and present the organisation of the European consortium responsible for its design and implementation.
    Comment: 7 pages, 2 figures, Proceedings of IAU Symp. 24

    Plausible home stars of the interstellar object 'Oumuamua found in Gaia DR2

    The first detected interstellar object 'Oumuamua, which passed within 0.25 au of the Sun on 2017 September 9, was presumably ejected from a stellar system. We use its newly determined non-Keplerian trajectory together with the reconstructed Galactic orbits of 7 million stars from Gaia DR2 to identify past close encounters. Such an "encounter" could reveal the home system from which 'Oumuamua was ejected. The closest encounter, at 0.60 pc (0.53-0.67 pc, 90% confidence interval), was with the M2.5 dwarf HIP 3757 at a relative velocity of 24.7 km/s, 1 Myr ago. A more distant encounter (1.6 pc), but with a lower encounter (ejection) velocity of 10.7 km/s, was with the G5 dwarf HD 292249, 3.8 Myr ago. Two more stars have encounter distances and velocities intermediate to these. The encounter parameters are similar across six different non-gravitational trajectories for 'Oumuamua. Ejection of 'Oumuamua by scattering from a giant planet in one of the systems is plausible, but requires a rather unlikely configuration to achieve the high velocities found. A binary star system is more likely to produce the observed velocities. None of the four home candidates have published exoplanets or are known to be binaries. Given that the 7 million stars in Gaia DR2 with 6D phase space information are just a small fraction of all stars for which we can eventually reconstruct orbits, it is a priori unlikely that our current search would find 'Oumuamua's home star system. As 'Oumuamua is expected to pass within 1 pc of about 20 stars and brown dwarfs every Myr, the plausibility of a home system depends also on an appropriate (low) encounter velocity.
    Comment: Accepted to The Astronomical Journal
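The encounter search described above reduces, for each candidate star, to finding the time and distance of closest approach between two trajectories. The paper integrates orbits in a Galactic potential; as a hedged sketch of the final geometric step, the function below uses a straight-line (linear motion) approximation, with function name and units chosen here for illustration:

```python
import math

def closest_approach(rel_pos, rel_vel):
    """Closest-approach distance and epoch for two bodies in uniform
    linear motion, given their current relative position (pc) and
    relative velocity (pc/Myr). The time of closest approach minimises
    |r + v*t|^2, giving t = -(r . v) / |v|^2."""
    v2 = sum(v * v for v in rel_vel)
    if v2 == 0.0:
        # No relative motion: the separation never changes.
        return math.sqrt(sum(x * x for x in rel_pos)), 0.0
    t_min = -sum(x * v for x, v in zip(rel_pos, rel_vel)) / v2
    d2 = sum((x + v * t_min) ** 2 for x, v in zip(rel_pos, rel_vel))
    return math.sqrt(d2), t_min
```

Negative `t_min` would mean the closest approach lies in the past along the extrapolated motion; note that 24.7 km/s corresponds to roughly 25 pc/Myr, so a 1 Myr look-back time matches a present-day separation of a few tens of parsecs.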

    Detailed 3D structure of OrionA in dust with Gaia DR2

    The unprecedented astrometry from Gaia DR2 provides us with an opportunity to study molecular clouds in the solar neighbourhood in detail. Extracting the wealth of information in these data remains a challenge, however. We have further improved our Gaussian Processes-based, three-dimensional dust mapping technique to allow us to study molecular clouds in more detail. These improvements include a significantly better scaling of the computational cost with the number of stars, and taking into account distance uncertainties to individual stars. Using Gaia DR2 astrometry together with 2MASS and WISE photometry for 30 000 stars, we infer the distribution of dust out to 600 pc in the direction of the Orion A molecular cloud. We identify a bubble-like structure in front of Orion A, centred at a distance of about 350 pc from the Sun. The main Orion A structure is visible at slightly larger distances, and we clearly see a tail extending over 100 pc that is curved and slightly inclined to the line of sight. The location of our foreground structure coincides with 5-10 Myr old stellar populations, suggesting a star formation episode that predates that of the Orion Nebula Cluster itself. We also identify the main structure of the Orion B molecular cloud, and in addition discover a background component to this at a distance of about 460 pc from the Sun. Finally, we associate our dust components at different distances with the plane-of-the-sky magnetic field orientation as mapped by Planck. This provides valuable information for modelling the magnetic field in 3D around star forming regions.
    Comment: Accepted for publication in Astronomy and Astrophysics. 9 pages, 12 figures
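The mapping technique above rests on a simple forward model: each star's extinction is the integral of the dust density along its line of sight out to the star's (uncertain) distance, with the Gaussian process supplying the prior over the density field. A minimal sketch of that line-of-sight integral for a piecewise-constant density follows; the function name, binning, and units are illustrative, not the paper's implementation:

```python
def cumulative_extinction(density, bin_edges, distance):
    """Integrate a piecewise-constant dust density (mag/pc per bin,
    bins delimited by bin_edges in pc) along the line of sight out
    to `distance` (pc), returning the predicted extinction in mag."""
    a = 0.0
    for rho, (lo, hi) in zip(density, zip(bin_edges, bin_edges[1:])):
        if distance <= lo:
            break  # the star lies in front of all remaining bins
        a += rho * (min(distance, hi) - lo)
    return a
```

In an inference scheme of this kind, the predicted extinctions are compared with photometric extinction estimates for many stars, and the density values (here `density`) are the quantities the Gaussian process constrains.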

    Gaia Data Processing Architecture

    Gaia is ESA's ambitious space astrometry mission, the main objective of which is to astrometrically and spectro-photometrically map 1000 million celestial objects (mostly in our Galaxy) with unprecedented accuracy. The announcement of opportunity for the data processing will be issued by ESA late in 2006. The Gaia Data Processing and Analysis Consortium (DPAC) has been formed recently and is preparing an answer. The satellite will downlink close to 100 TB of raw telemetry data over 5 years. To achieve the required astrometric accuracy of a few tens of microarcseconds, highly involved processing of these data is required. In addition to the main astrometric instrument, Gaia will host a radial velocity instrument, two low-resolution dispersers for multi-colour photometry, and two star mappers. Gaia is a flying gigapixel camera. The various instruments each require relatively complex processing while at the same time being interdependent. We describe the overall composition of the DPAC and the envisaged overall architecture of the Gaia data processing system. We delve further into the core processing - one of the nine so-called coordination units comprising the Gaia processing system.
    Comment: 10 pages, 2 figures. To appear in ADASS XVI Proceedings

    CLOUDS search for variability in brown dwarf atmospheres

    Context: L-type ultra-cool dwarfs and brown dwarfs have cloudy atmospheres that could host weather-like phenomena. The detection of photometric or spectral variability would provide insight into unresolved atmospheric heterogeneities, such as holes in a global cloud deck. Aims: It has been proposed that growth of heterogeneities in the global cloud deck may account for the L- to T-type transition as brown dwarf photospheres evolve from cloudy to clear conditions. Such a mechanism is compatible with variability. We searched for variability in the spectra of five L6 to T6 brown dwarfs in order to test this hypothesis. Methods: We obtained spectroscopic time series using VLT/ISAAC, over 0.99-1.13 um, and IRTF/SpeX for two of our targets, in the J, H and K bands. We searched for statistically variable lines and for correlations between them. Results: High spectral-frequency variations are seen in some objects, but these detections are marginal and need to be confirmed. We find no evidence for large amplitude variations in spectral morphology, and we place firm upper limits of 2 to 3% on broad-band variability on time scales of a few hours. The T2 transition brown dwarf SDSS J1254-0122 shows numerous variable features, but a secure variability diagnosis would require further observations. Conclusions: Assuming that any variability arises from the rotation of patterns of large-scale clear and cloudy regions across the surface, we find that the typical physical scale of cloud cover disruption should be smaller than 5-8% of the disk area for four of our targets. The possible variations seen in SDSS J1254-0122 are not strong enough to allow us to confirm the cloud breaking hypothesis.
    Comment: 17 pages, 14 figures, accepted by A&A
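Upper limits like the 2-3% quoted above typically come from testing each light curve against the hypothesis of constant flux. A minimal sketch of such a test is given below; the paper's actual statistics differ in detail, and the function name is chosen here for illustration:

```python
def constant_source_chi2(flux, sigma):
    """Chi-square of a light curve against its inverse-variance
    weighted mean. Values far above the number of degrees of freedom
    (len(flux) - 1) flag variability; values consistent with it set
    an upper limit on any variability amplitude."""
    weights = [1.0 / s ** 2 for s in sigma]
    mean = sum(w * f for w, f in zip(weights, flux)) / sum(weights)
    chi2 = sum(((f - mean) / s) ** 2 for f, s in zip(flux, sigma))
    return chi2, mean
```

A flat light curve with well-estimated errors gives a chi-square near the degrees of freedom, while a few-percent modulation sampled with sub-percent errors pushes it well above.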

    Three-Dimensional Spectral Classification of Low-Metallicity Stars Using Artificial Neural Networks

    We explore the application of artificial neural networks (ANNs) for the estimation of atmospheric parameters (Teff, logg, and [Fe/H]) for Galactic F- and G-type stars. The ANNs are fed with medium-resolution (~ 1-2 A), non-flux-calibrated spectroscopic observations. From a sample of 279 stars with previous high-resolution determinations of metallicity, and a set of (external) estimates of temperature and surface gravity, our ANNs are able to predict Teff with an accuracy of ~ 135-150 K over the range 4250 <= Teff <= 6500 K, logg with an accuracy of ~ 0.25-0.30 dex over the range 1.0 <= logg <= 5.0 dex, and [Fe/H] with an accuracy of ~ 0.15-0.20 dex over the range -4.0 <= [Fe/H] <= +0.3. Such accuracies are competitive with the results obtained by fine analysis of high-resolution spectra. It is noteworthy that the ANNs are able to obtain these results without consideration of photometric information for these stars. We have also explored the impact of the signal-to-noise ratio (S/N) on the behavior of ANNs, and conclude that, when analyzed with ANNs trained on spectra of commensurate S/N, it is possible to extract physical parameter estimates of similar accuracy from stellar spectra having S/N as low as 13. Taken together, these results indicate that the ANN approach should be of primary importance for use in present and future large-scale spectroscopic surveys.
    Comment: 51 pages, 11 eps figures, uses aastex; to appear in ApJ
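At its core, the ANN described above is a regression network mapping a spectrum's flux bins to the triple (Teff, logg, [Fe/H]). A self-contained forward pass of such a network, with tanh hidden units and a linear output layer, is sketched below; the architecture and any weights shown are illustrative assumptions, not the trained networks of the paper:

```python
import math

def mlp_forward(x, weights, biases):
    """Forward pass of a fully connected network: tanh activations on
    every layer except the last, which is linear (appropriate for
    regressing continuous parameters such as Teff, logg, [Fe/H]).
    `weights` is a list of layer matrices (rows = output units),
    `biases` the matching list of bias vectors."""
    for W, b in zip(weights[:-1], biases[:-1]):
        x = [math.tanh(sum(wij * xj for wij, xj in zip(row, x)) + bi)
             for row, bi in zip(W, b)]
    W, b = weights[-1], biases[-1]
    return [sum(wij * xj for wij, xj in zip(row, x)) + bi
            for row, bi in zip(W, b)]
```

In practice the input vector would be a normalised spectrum of a few hundred to a few thousand flux bins, and the weights would be fitted on the labelled training set of stars with known parameters.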

    Genetic Classification of Populations using Supervised Learning

    There are many instances in genetics in which we wish to determine whether two candidate populations are distinguishable on the basis of their genetic structure. Examples include populations which are geographically separated, case-control studies, and quality control (when participants in a study have been genotyped at different laboratories). This latter application is of particular importance in the era of large-scale genome-wide association studies, when collections of individuals genotyped at different locations are being merged to provide increased power. The traditional method for detecting structure within a population is some form of exploratory technique such as principal components analysis. Such methods, which do not utilise our prior knowledge of the membership of the candidate populations, are termed unsupervised. Supervised methods, on the other hand, are able to utilise this prior knowledge when it is available. In this paper we demonstrate that in such cases modern supervised approaches are a more appropriate tool for detecting genetic differences between populations. We apply two such methods (neural networks and support vector machines) to the classification of three populations (two from Scotland and one from Bulgaria). The sensitivity exhibited by both these methods is considerably higher than that attained by principal components analysis, and in fact comfortably exceeds a recently conjectured theoretical limit on the sensitivity of unsupervised methods. In particular, our methods can distinguish between the two Scottish populations, where principal components analysis cannot. We suggest, on the basis of our results, that a supervised learning approach should be the method of choice when classifying individuals into pre-defined populations, particularly in quality control for large-scale genome-wide association studies.
    Comment: Accepted by PLOS ONE
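The contrast drawn above, supervised methods exploiting known population labels where PCA does not, can be illustrated with the simplest possible supervised classifier: a nearest-centroid rule over genotype vectors. This is a deliberately minimal stand-in for the neural networks and support vector machines the paper actually uses:

```python
def fit_centroids(X, y):
    """Mean genotype vector per labelled population; X is a list of
    numeric genotype vectors, y the matching population labels."""
    grouped = {}
    for xi, label in zip(X, y):
        grouped.setdefault(label, []).append(xi)
    return {label: [sum(col) / len(rows) for col in zip(*rows)]
            for label, rows in grouped.items()}

def predict(centroids, x):
    """Assign x to the population with the nearest centroid
    (squared Euclidean distance)."""
    def dist2(label):
        return sum((a - b) ** 2 for a, b in zip(centroids[label], x))
    return min(centroids, key=dist2)
```

Unlike PCA, which looks for directions of maximal variance regardless of labels, this rule is told in advance which individuals belong to which candidate population and only has to decide where a new individual falls.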

    Deindustrialisation and the long term decline in fatal occupational injuries

    Aims: To examine the extent to which deindustrialisation accounts for long term trends in occupational injury risk in the United States.

    Parametrization and Classification of 20 Billion LSST Objects: Lessons from SDSS

    The Large Synoptic Survey Telescope (LSST) will be a large, wide-field ground-based system designed to obtain, starting in 2015, multiple images of the sky that is visible from Cerro Pachon in Northern Chile. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg^2 region about 1000 times during the anticipated 10 years of operations (distributed over six bands, ugrizy). Each 30-second long visit will deliver a 5σ depth for point sources of r ~ 24.5 on average. The co-added map will be about 3 magnitudes deeper, and will include 10 billion galaxies and a similar number of stars. We discuss various measurements that will be automatically performed for these 20 billion sources, and how they can be used for classification and determination of source physical and other properties. We provide a few classification examples based on SDSS data, such as color classification of stars, color-spatial proximity search for wide-angle binary stars, orbital-color classification of asteroid families, and the recognition of main Galaxy components based on the distribution of stars in the position-metallicity-kinematics space. Guided by these examples, we anticipate that two grand classification challenges for LSST will be 1) rapid and robust classification of sources detected in difference images, and 2) simultaneous treatment of diverse astrometric and photometric time series measurements for an unprecedentedly large number of objects.
    Comment: Presented at the "Classification and Discovery in Large Astronomical Surveys" meeting, Ringberg Castle, 14-17 October, 200
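Of the classification examples listed, color classification of stars is the easiest to sketch: a cut in a color-color diagram separating, say, bluer from redder stellar populations. The straight-line boundary and its coefficients below are hypothetical placeholders, not the SDSS-derived cuts of the paper:

```python
def classify_star(g_r, r_i):
    """Toy color classification in the (g-r, r-i) plane: a single
    straight-line cut. Real survey classifiers fit such boundaries
    (or more flexible ones) to labelled training samples."""
    return "red" if r_i > 0.5 * g_r + 0.1 else "blue"
```

Scaled to 20 billion sources, even a trivial per-object rule like this has to run inside the survey pipeline, which is why the paper emphasises automated, robust classification over interactive analysis.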