Plausible home stars of the interstellar object 'Oumuamua found in Gaia DR2
The first detected interstellar object, 'Oumuamua, which passed within 0.25au of
the Sun on 2017 September 9, was presumably ejected from a stellar system. We
use its newly determined non-Keplerian trajectory together with the
reconstructed Galactic orbits of 7 million stars from Gaia DR2 to identify past
close encounters. Such an "encounter" could reveal the home system from which
'Oumuamua was ejected. The closest encounter, at 0.60pc (0.53-0.67pc, 90%
confidence interval), was with the M2.5 dwarf HIP 3757 at a relative velocity
of 24.7km/s, 1Myr ago. A more distant encounter (1.6pc) but with a lower
encounter (ejection) velocity of 10.7km/s was with the G5 dwarf HD 292249,
3.8Myr ago. Two more stars have encounter distances and velocities intermediate
between these two. The encounter parameters are similar across six different
non-gravitational trajectories for 'Oumuamua. Ejection of 'Oumuamua by
scattering from a giant planet in one of the systems is plausible, but requires
a rather unlikely configuration to achieve the high velocities found. A binary
star system is more likely to produce the observed velocities. None of the four
home candidates have published exoplanets or are known to be binaries. Given
that the 7 million stars in Gaia DR2 with 6D phase space information are just a
small fraction of all stars for which we can eventually reconstruct orbits, it
is a priori unlikely that our current search would find 'Oumuamua's home star
system. As 'Oumuamua is expected to pass within 1pc of about 20 stars and brown
dwarfs every Myr, the plausibility of a home system depends also on an
appropriate (low) encounter velocity.
Comment: Accepted to The Astronomical Journal
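The close-encounter search described above can be illustrated with a minimal sketch: under the simplifying assumption of straight-line relative motion (the paper integrates orbits in the Galactic potential; the star data below are invented placeholders, not Gaia DR2 values), the time and distance of closest approach follow directly from the relative position and velocity vectors.

    import numpy as np

    KMS_TO_PC_PER_MYR = 1.0227  # 1 km/s expressed in pc/Myr

    def closest_approach(r0_pc, v_rel_kms):
        # r0_pc: current relative position [pc]; v_rel_kms: relative velocity [km/s].
        # Returns (time of closest approach in Myr, miss distance in pc);
        # a negative time means the encounter happened in the past.
        r0 = np.asarray(r0_pc, dtype=float)
        v = np.asarray(v_rel_kms, dtype=float) * KMS_TO_PC_PER_MYR
        t_ca = -np.dot(r0, v) / np.dot(v, v)
        d_ca = np.linalg.norm(r0 + v * t_ca)
        return t_ca, d_ca

    # Invented example: a star currently ~25 pc away and receding,
    # so its closest approach lies in the past (negative time).
    t, d = closest_approach([20.0, 10.0, -10.0], [18.0, 9.0, -9.5])
    print(f"closest approach: {d:.2f} pc at {t:.1f} Myr")

In the paper the trajectories are integrated in a Galactic potential and sampled over the orbital uncertainties; the linear approximation above is only adequate for nearby, recent encounters.
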
Gaia: Organisation and challenges for the data processing
Gaia is an ambitious ESA space astrometry mission whose main objective is to
map the sky in astrometry and photometry down to magnitude 20 by the end of
the next decade. While the mission is built and operated by ESA and an
industrial consortium, the data processing is entrusted to a consortium drawn
from the scientific community, formed in 2006 and formally selected by
ESA one year later. The satellite will downlink around 100 TB of raw telemetry
data over a mission duration of 5 years from which a very complex iterative
processing will lead to the final science output: astrometry with a final
accuracy of a few tens of microarcseconds, epoch photometry in wide and narrow
bands, and radial velocities and spectra for stars brighter than 17 mag. We
discuss the general principles and main difficulties of this very large data
processing and present the organisation of the European Consortium responsible
for its design and implementation.
Comment: 7 pages, 2 figures, Proceedings of IAU Symp. 248
Three-Dimensional Spectral Classification of Low-Metallicity Stars Using Artificial Neural Networks
We explore the application of artificial neural networks (ANNs) for the
estimation of atmospheric parameters (Teff, logg, and [Fe/H]) for Galactic F-
and G-type stars. The ANNs are fed with medium-resolution (~ 1-2 A),
non-flux-calibrated spectroscopic observations. From a sample of 279 stars with
previous high-resolution determinations of metallicity, and a set of (external)
estimates of temperature and surface gravity, our ANNs are able to predict Teff
with an accuracy of ~ 135-150 K over the range 4250 <= Teff <= 6500 K, logg
with an accuracy of ~ 0.25-0.30 dex over the range 1.0 <= logg <= 5.0 dex, and
[Fe/H] with an accuracy of ~ 0.15-0.20 dex over the range -4.0 <= [Fe/H] <= +0.3.
Such accuracies are competitive with the results obtained by fine analysis of
high-resolution spectra. It is noteworthy that the ANNs are able to obtain
these results without consideration of photometric information for these stars.
We have also explored the impact of the signal-to-noise ratio (S/N) on the
behavior of ANNs, and conclude that, when spectra are analyzed with ANNs trained
on spectra of commensurate S/N, it is possible to extract physical parameter
estimates of similar accuracy from stellar spectra with S/N as low as 13. Taken together,
these results indicate that the ANN approach should be of primary importance
for use in present and future large-scale spectroscopic surveys.
Comment: 51 pages, 11 eps figures, uses aastex; to appear in ApJ
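As a hedged sketch of the technique (not the authors' code or network architecture), a small multi-layer perceptron can be trained to map flux vectors to (Teff, logg, [Fe/H]); the data below are random placeholders and the scikit-learn model choice is an assumption.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Placeholder training set: 279 spectra x 400 flux bins, with labels
    # (Teff [K], log g [dex], [Fe/H] [dex]) from external determinations.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(279, 400))
    y = np.column_stack([rng.uniform(4250, 6500, 279),
                         rng.uniform(1.0, 5.0, 279),
                         rng.uniform(-4.0, 0.3, 279)])

    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(32, 32),
                                       max_iter=2000, random_state=0))
    model.fit(X, y)                             # learn spectrum -> parameters
    teff, logg, feh = model.predict(X[:1])[0]   # parameters for one spectrum
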
CLOUDS search for variability in brown dwarf atmospheres
Context: L-type ultra-cool dwarfs and brown dwarfs have cloudy atmospheres
that could host weather-like phenomena. The detection of photometric or
spectral variability would provide insight into unresolved atmospheric
heterogeneities, such as holes in a global cloud deck.
Aims: It has been proposed that growth of heterogeneities in the global cloud
deck may account for the L- to T-type transition as brown dwarf photospheres
evolve from cloudy to clear conditions. Such a mechanism is compatible with
variability. We searched for variability in the spectra of five L6 to T6 brown
dwarfs in order to test this hypothesis.
Methods: We obtained spectroscopic time series using VLT/ISAAC, over
0.99-1.13um, and IRTF/SpeX for two of our targets, in the J, H, and K bands. We
searched for statistically variable lines and for correlations between them.
Results: High spectral-frequency variations are seen in some objects, but
these detections are marginal and need to be confirmed. We find no evidence for
large amplitude variations in spectral morphology and we place firm upper
limits of 2 to 3% on broad-band variability, on the time scale of a few hours.
The T2 transition brown dwarf SDSS J1254-0122 shows numerous variable features,
but a secure variability diagnosis would require further observations.
Conclusions: Assuming that any variability arises from the rotation of
patterns of large-scale clear and cloudy regions across the surface, we find
that the typical physical scale of cloud cover disruption should be smaller
than 5-8% of the disk area for four of our targets. The possible variations
seen in SDSS J1254-0122 are not strong enough to allow us to confirm the
cloud-breaking hypothesis.
Comment: 17 pages, 14 figures, accepted by A&A
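A minimal sketch (synthetic light curve and simple rms statistics; not the CLOUDS pipeline) of how a broad-band variability upper limit of a few per cent can be quantified from a photometric time series:

    import numpy as np

    rng = np.random.default_rng(1)
    sigma_phot = 0.01                                   # assumed 1% per-point noise
    flux = 1.0 + rng.normal(0.0, sigma_phot, size=60)   # hours-long relative flux series

    rms = np.std(flux, ddof=1)
    # Variability in excess of the photometric noise, if any:
    excess = np.sqrt(max(rms**2 - sigma_phot**2, 0.0))
    # Crude bound on variability that could hide below the measured scatter:
    upper_limit = 3.0 * rms
    print(f"excess rms: {excess:.3f}, upper limit ~ {upper_limit:.1%}")
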
Gaia Data Processing Architecture
Gaia is ESA's ambitious space astrometry mission, the main objective of which
is to astrometrically and spectro-photometrically map 1000 million celestial
objects (mostly in our Galaxy) with unprecedented accuracy. The announcement of
opportunity for the data processing will be issued by ESA late in 2006. The
Gaia Data Processing and Analysis Consortium (DPAC) has been formed recently
and is preparing an answer. The satellite will downlink close to 100 TB of raw
telemetry data over 5 years. To achieve the required astrometric accuracy of a
few tens of microarcseconds, a highly involved processing of these data is
required.
In addition to the main astrometric instrument, Gaia will host a Radial
Velocity instrument, two low-resolution dispersers for multi-color photometry,
and two Star Mappers. Gaia is, in effect, a flying gigapixel camera. The various
instruments each require relatively complex processing while at the same time
being interdependent. We describe the overall composition of the DPAC and the
envisaged overall architecture of the Gaia data processing system. We shall
delve further into the core processing, one of the nine so-called
coordination units comprising the Gaia processing system.
Comment: 10 pages, 2 figures. To appear in ADASS XVI Proceedings
Genetic Classification of Populations using Supervised Learning
There are many instances in genetics in which we wish to determine whether
two candidate populations are distinguishable on the basis of their genetic
structure. Examples include populations which are geographically separated,
case-control studies and quality control (when participants in a study have
been genotyped at different laboratories). This latter application is of
particular importance in the era of large-scale genome-wide association
studies, when collections of individuals genotyped at different locations are
being merged to provide increased power. The traditional method for detecting
structure within a population is some form of exploratory technique such as
principal components analysis. Such methods, which do not utilise our prior
knowledge of the membership of the candidate populations, are termed
\emph{unsupervised}. Supervised methods, on the other hand, are able to utilise
this prior knowledge when it is available.
In this paper we demonstrate that in such cases modern supervised approaches
are a more appropriate tool for detecting genetic differences between
populations. We apply two such methods (neural networks and support vector
machines) to the classification of three populations (two from Scotland and one
from Bulgaria). The sensitivity exhibited by both these methods is considerably
higher than that attained by principal components analysis and in fact
comfortably exceeds a recently conjectured theoretical limit on the sensitivity
of unsupervised methods. In particular, our methods can distinguish between the
two Scottish populations, where principal components analysis cannot. We
suggest, on the basis of our results, that a supervised learning approach should
be the method of choice when classifying individuals into pre-defined
populations, particularly in quality control for large-scale genome-wide
association studies.
Comment: Accepted by PLOS ONE
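As an illustrative sketch of the supervised approach (synthetic genotypes, not the study's data; the linear-kernel SVM settings are assumptions), cross-validated classification accuracy serves as the test of whether two candidate populations are distinguishable:

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    # Synthetic genotype matrix: 200 individuals x 500 SNPs coded 0/1/2,
    # with a small allele-frequency shift between the two populations.
    rng = np.random.default_rng(2)
    n, p = 200, 500
    labels = np.repeat([0, 1], n // 2)
    freqs = np.full((n, p), 0.30)
    freqs[labels == 1] = 0.35
    genotypes = rng.binomial(2, freqs)

    # Supervised test: accuracy well above 0.5 means the populations
    # are distinguishable on the basis of their genetic structure.
    svm = SVC(kernel="linear", C=1.0)
    accuracy = cross_val_score(svm, genotypes, labels, cv=5).mean()
    print(f"cross-validated accuracy: {accuracy:.2f}")
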
The Substellar Mass Function in sigma Orionis
We combine results from imaging searches for substellar objects in the sigma
Orionis cluster and follow-up photometric and spectroscopic observations to
derive a census of the brown dwarf population in a region of 847 arcmin^2. We
identify 64 very low-mass cluster member candidates in this region. We have
available three-color (IZJ) photometry for all of them, spectra for 9 objects,
and K-band photometry for 27% of our sample. These data provide a well-defined
sequence in the I vs. I-J and I vs. I-K color-magnitude diagrams and indicate that the
cluster is affected by little reddening despite its young age (~5 Myr). Using
state-of-the-art evolutionary models, we derive a mass function from the
low-mass stars (0.2 Msol) across the complete brown dwarf domain (0.075 Msol to
0.013 Msol), and into the realm of free-floating planetary-mass objects (<0.013
Msol). We find that the mass spectrum (dN/dm ~ m^{-alpha}) increases toward
lower masses with an exponent alpha = 0.8+/-0.4. Our results suggest that
planetary-mass isolated objects could be as common as brown dwarfs; both kinds
of objects together would be as numerous as stars in the cluster. If the
distribution of stellar and substellar masses in sigma Orionis is
representative of the Galactic disk, older and much lower luminosity
free-floating planetary-mass objects with masses down to about 0.005 Msol
should be abundant in the solar vicinity, with a density similar to M-type
stars.
Comment: Accepted for publication in ApJ. 19 pages, 3 figures included
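As a worked illustration (not taken from the paper) of how the measured exponent translates into relative object counts, the mass spectrum dN/dm ~ m^(-alpha) can be integrated over the mass ranges quoted above; alpha = 0.8 and the mass limits follow the abstract, the rest is a sketch.

    def relative_count(m_lo, m_hi, alpha=0.8):
        # Number of objects between m_lo and m_hi (in Msol) for
        # dN/dm proportional to m**(-alpha), up to a common normalisation.
        return (m_hi**(1.0 - alpha) - m_lo**(1.0 - alpha)) / (1.0 - alpha)

    n_planetary = relative_count(0.005, 0.013)   # free-floating planetary-mass objects
    n_bd        = relative_count(0.013, 0.075)   # brown dwarfs
    n_stars     = relative_count(0.075, 0.2)     # low-mass stars in the surveyed range
    print(n_planetary / n_bd, (n_planetary + n_bd) / n_stars)
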
Spectral Classification: Old and Contemporary
We begin with a historical account of spectral classification and describe its
refinement through additional criteria. The line strengths and
ratios used in the two-dimensional classification of each spectral class are
described. A parallel classification scheme for metal-poor stars and the
standards used for classification are presented. The extension of spectral
classification beyond M to L and T and spectroscopic classification criteria
relevant to these classes are described. Contemporary methods of
classification based upon different automated approaches are introduced.
Comment: To be published in "Principles and Perspectives in Cosmochemistry",
Lecture Notes of the Kodai School on Synthesis of Elements in Stars, Eds. Aruna
Goswami & Eswar Reddy, Springer Verlag, 2009, 17 pages, 10 figures
Parametrization and Classification of 20 Billion LSST Objects: Lessons from SDSS
The Large Synoptic Survey Telescope (LSST) will be a large, wide-field
ground-based system designed to obtain, starting in 2015, multiple images of
the sky that is visible from Cerro Pachon in Northern Chile. About 90% of the
observing time will be devoted to a deep-wide-fast survey mode which will
observe a 20,000 deg^2 region about 1000 times during the anticipated 10
years of operations (distributed over six bands, ugrizy). Each 30-second-long
visit will deliver a 5-sigma depth for point sources of r ~ 24.5 on average.
The co-added map will be about 3 magnitudes deeper, and will include 10 billion
galaxies and a similar number of stars. We discuss various measurements that
will be automatically performed for these 20 billion sources, and how they can
be used for classification and determination of source physical and other
properties. We provide a few classification examples based on SDSS data, such
as color classification of stars, color-spatial proximity search for wide-angle
binary stars, orbital-color classification of asteroid families, and the
recognition of main Galaxy components based on the distribution of stars in the
position-metallicity-kinematics space. Guided by these examples, we anticipate
that two grand classification challenges for LSST will be 1) rapid and robust
classification of sources detected in difference images, and 2) {\it
simultaneous} treatment of diverse astrometric and photometric time series
measurements for an unprecedentedly large number of objects.
Comment: Presented at the "Classification and Discovery in Large Astronomical
Surveys" meeting, Ringberg Castle, 14-17 October 2008
