
    A method for exploiting domain information in astrophysical parameter estimation

    I outline a method for estimating astrophysical parameters (APs) from multidimensional data. It is a supervised method based on matching observed data (e.g. a spectrum) to a grid of pre-labelled templates. However, unlike standard machine learning methods such as ANNs, SVMs or k-nn, this algorithm explicitly uses domain information to better weight each data dimension in the estimation. Specifically, it uses the sensitivity of each measured variable to each AP to perform a local, iterative interpolation of the grid. It avoids both the non-uniqueness problem of global regression and the grid-resolution limitation of nearest neighbours.
    Comment: Proceedings of ADASS17 (September 2007, London). 4 pages. To appear in ASP Conf. Proc.
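As an illustration only (my own toy sketch, not the author's code), the core idea of weighting each data dimension by its sensitivity to the AP, and then iterating a local estimate on the grid, might look like this; the `forward` model, the grid, and the weighting scheme are all invented for the example:

```python
import numpy as np

def forward(ap):
    """Toy forward model: a 2-pixel 'spectrum' as a function of one AP.
    Dimension 0 is sensitive to the AP; dimension 1 is nearly flat."""
    return np.array([ap**2, 0.01 * ap + 5.0])

ap_grid = np.linspace(1.0, 3.0, 21)               # pre-labelled template grid
templates = np.array([forward(a) for a in ap_grid])

# Sensitivities d(data)/d(AP), estimated by finite differences on the grid.
sens = np.gradient(templates, ap_grid, axis=0)

def estimate(obs, n_iter=5):
    """Coarse nearest-template start, then sensitivity-weighted local refinement."""
    i = int(np.argmin(((templates - obs) ** 2).sum(axis=1)))
    ap = ap_grid[i]
    for _ in range(n_iter):
        j = int(np.argmin(np.abs(ap_grid - ap)))
        s = sens[j]                                # local sensitivity vector
        w = s**2 / (s**2).sum()                    # sensitive dimensions dominate
        resid = obs - forward(ap)
        ap = ap + (w * s * resid).sum() / (w * s * s).sum()  # weighted update
    return ap

print(round(estimate(forward(2.37)), 2))           # recovers the input AP
```

Here the insensitive second dimension gets almost no weight, so it cannot drag the estimate away from the solution, which is the motivation the abstract gives for using sensitivities.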

    A Bayesian method for the analysis of deterministic and stochastic time series

    I introduce a general, Bayesian method for modelling univariate time series data assumed to be drawn from a continuous, stochastic process. The method accommodates arbitrary temporal sampling, and takes into account measurement uncertainties for arbitrary error models (not just Gaussian) on both the time and signal variables. Any model for the deterministic component of the variation of the signal with time is supported, as is any model of the stochastic component on the signal and time variables. Models illustrated here are constant and sinusoidal models for the signal mean combined with a Gaussian stochastic component, as well as a purely stochastic model, the Ornstein-Uhlenbeck process. The posterior probability distribution over model parameters is determined via Monte Carlo sampling. Models are compared using the "cross-validation likelihood", in which posterior-averaged likelihoods for different partitions of the data are combined. In principle this is more robust to changes in the prior than is the evidence (the prior-averaged likelihood). The method is demonstrated by applying it to the light curves of 11 ultra cool dwarf stars, claimed by a previous study to show statistically significant variability. This is reassessed here by calculating the cross-validation likelihood for various time series models, including a null hypothesis of no variability beyond the error bars. 10 of the 11 light curves are confirmed as being significantly variable, and one of these seems to be periodic, with two plausible periods identified. Another object is best described by the Ornstein-Uhlenbeck process, a conclusion which is obviously limited to the set of models actually tested.
    Comment: Published in A&A as a free access article. Software and additional information available from http://www.mpia.de/~calj/ctsmod.htm
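One of the stochastic models named above, the Ornstein-Uhlenbeck process, can be simulated exactly at irregular sample times via its Gaussian conditional transition, which is what makes it convenient for the arbitrary temporal sampling the abstract mentions. A minimal sketch (mine, not the paper's software; all parameter values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_ou(times, mu=0.0, tau=1.0, sigma=0.5, x0=0.0):
    """OU process dx = -(x - mu)/tau dt + sigma dW, sampled at `times`.
    The exact conditional Gaussian transition is used, so irregular
    time steps incur no discretization error."""
    x = np.empty(len(times))
    x[0] = x0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        a = np.exp(-dt / tau)
        mean = mu + (x[i - 1] - mu) * a            # relaxation toward mu
        var = 0.5 * sigma**2 * tau * (1.0 - a**2)  # stationary var is sigma^2*tau/2
        x[i] = rng.normal(mean, np.sqrt(var))
    return x

t = np.sort(rng.uniform(0.0, 10.0, 200))           # irregular sampling
lc = simulate_ou(t)
print(lc.shape)
```

The cross-validation likelihood described above would then score such a model by averaging the likelihood of held-out data partitions over the posterior, rather than over the prior as the evidence does.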

    Microarcsecond astrometry with Gaia: the solar system, the Galaxy and beyond

    Gaia is an all-sky, high precision astrometric and photometric satellite of the European Space Agency (ESA) due for launch in 2010-2011. Its primary mission is to study the composition, formation and evolution of our Galaxy. Gaia will measure parallaxes and proper motions of every object in the sky brighter than V=20, amounting to a billion stars, galaxies, quasars and solar system objects. It will achieve an astrometric accuracy of 10 muas at V=15, corresponding to a distance accuracy of 1% at 1 kpc. With Gaia, tens of millions of stars will have their distances measured to a few percent or better. This is an improvement over Hipparcos by several orders of magnitude in the number of objects, accuracy and limiting magnitude. Gaia will also measure radial velocities for sources brighter than V~17. To characterize the objects, each is observed in 15 medium and broad photometric bands with an onboard CCD camera. With these capabilities, Gaia will make significant advances in a wide range of astrophysical topics. These include a detailed kinematical map of stellar populations, stellar structure and evolution, the discovery and characterization of thousands of exoplanetary systems and General Relativity on large scales. I give an overview of the mission, its operating principles and its expected scientific contributions. For the latter I provide a quick look at five areas of increasing scale in the universe: the solar system, exosolar planets, stellar clusters and associations, Galactic structure and extragalactic astronomy.
    Comment: (Errors corrected) Invited paper at IAU Colloquium 196, "Transit of Venus: New Views of the Solar System and Galaxy". 14 pages, 6 figures. Version with higher resolution figures available from http://www.mpia-hd.mpg.de/homes/calj/gaia_venus2004.htm
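The quoted link between astrometric and distance accuracy is simple arithmetic: a star at distance d parsec has parallax 1/d arcsec, so the fractional distance error is approximately the fractional parallax error. A quick check of the numbers in the abstract:

```python
# A star at distance d (pc) has parallax p = 1/d arcsec, so a parallax
# error sigma_p gives a fractional distance error of roughly sigma_p / p.
sigma_p_uas = 10.0            # quoted accuracy at V = 15, in microarcsec
d_pc = 1000.0                 # 1 kpc
p_uas = 1e6 / d_pc            # parallax in microarcsec (= 1000 muas = 1 mas)
frac_err = sigma_p_uas / p_uas
print(f"{frac_err:.0%}")      # -> 1%
```

This reproduces the "1% at 1 kpc" figure, and shows why the fractional error grows linearly with distance at fixed parallax accuracy.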

    Limits on the infrared photometric monitoring of brown dwarfs

    Recent monitoring programs of ultra cool field M and L dwarfs (low mass stars or brown dwarfs) have uncovered low amplitude photometric I-band variations which may be associated with an inhomogeneous distribution of photospheric condensates. Further evidence hints that this distribution may evolve on very short timescales, specifically of order a rotation period or less. In an attempt to study this behaviour in more detail, we have carried out a pilot program to monitor three L dwarfs in the near infrared, where these objects are significantly brighter than at shorter wavelengths. We present a robust data analysis method for improving the precision and reliability of infrared photometry. No significant variability was detected in either the J or Km bands in 2M1439 and SDSS1203 above a peak-to-peak amplitude of 0.04 mag (0.08 mag for 2M1112). The main limiting factor in achieving lower detection limits is suspected to be second-order extinction effects in the Earth's atmosphere, on account of the very different colours of the target and reference stars. Suggestions are given for overcoming such effects which should improve the sensitivity and reliability of infrared variability searches.
    Comment: MNRAS, in press (9 pages)
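The basic idea behind such monitoring is differential (ensemble) photometry: divide the target's flux by a combined reference-star flux in the same frames, removing transparency and airmass variations common to all stars. A toy sketch of my own (not the paper's pipeline; all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(1)

n_frames, n_refs = 100, 5
# Transparency/airmass trend shared by every star in the field:
transparency = 1.0 + 0.1 * np.sin(np.linspace(0.0, 4.0, n_frames))
target_true = 1.0                                   # intrinsically constant source
target = target_true * transparency * (1 + 0.005 * rng.standard_normal(n_frames))
refs = transparency[:, None] * (1 + 0.005 * rng.standard_normal((n_frames, n_refs)))

ensemble = refs.mean(axis=1)      # could be S/N-weighted instead
rel_lc = target / ensemble        # differential light curve: shared trend cancels
rel_lc /= np.median(rel_lc)       # normalise to unit median

print(f"rms = {rel_lc.std():.4f}")  # far below the 10% transparency swing
```

Note what this does *not* remove: if the target is much redder than the reference stars, colour-dependent (second-order) extinction varies differently for target and ensemble, which is exactly the limiting factor the abstract identifies.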

    The ILIUM forward modelling algorithm for multivariate parameter estimation and its application to derive stellar parameters from Gaia spectrophotometry

    I introduce an algorithm for estimating parameters from multidimensional data based on forward modelling. In contrast to many machine learning approaches it avoids fitting an inverse model and the problems associated with this. The algorithm makes explicit use of the sensitivities of the data to the parameters, with the goal of better treating parameters which only have a weak impact on the data. The forward modelling approach provides uncertainty (full covariance) estimates in the predicted parameters as well as a goodness-of-fit for observations. I demonstrate the algorithm, ILIUM, with the estimation of stellar astrophysical parameters (APs) from simulations of the low resolution spectrophotometry to be obtained by Gaia. The AP accuracy is competitive with that obtained by a support vector machine. For example, for zero extinction stars covering a wide range of metallicity, surface gravity and temperature, ILIUM can estimate Teff to an accuracy of 0.3% at G=15 and to 4% for (lower signal-to-noise ratio) spectra at G=20. [Fe/H] and logg can be estimated to accuracies of 0.1-0.4 dex for stars with G<=18.5. If extinction varies a priori over a wide range (Av = 0-10 mag), then Teff and Av can be estimated quite accurately (3-4% and 0.1-0.2 mag respectively at G=15), but there is a strong and ubiquitous degeneracy in these parameters which limits our ability to estimate either accurately at faint magnitudes. Using the forward model we can map these degeneracies (in advance), and thus provide a complete probability distribution over solutions. (Abridged)
    Comment: MNRAS, in press. This revision corrects a few minor errors and typos. A better formatted version for A4 paper is available at http://www.mpia.de/home/calj/ilium.pd
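A generic sketch of forward-model parameter estimation in the spirit described here (not the actual ILIUM code, whose update scheme differs in detail): iterate weighted Gauss-Newton updates using the sensitivities (the Jacobian of the forward model), then read the parameter covariance and a goodness-of-fit off the normal equations. The two-parameter `forward` model below is a made-up linear toy:

```python
import numpy as np

def forward(theta):
    """Toy 2-parameter forward model producing a 4-pixel 'spectrum'."""
    t, g = theta
    return np.array([t + g, t - g, 2.0 * t, 0.5 * g])

# Sensitivities d(data)/d(theta); constant here because the toy is linear.
J = np.array([[1, 1], [1, -1], [2, 0], [0, 0.5]], dtype=float)

def fit(obs, sigma, theta0, n_iter=10):
    theta = np.asarray(theta0, dtype=float)
    W = np.diag(1.0 / sigma**2)
    for _ in range(n_iter):
        r = obs - forward(theta)
        A = J.T @ W @ J                        # normal-equation matrix
        theta = theta + np.linalg.solve(A, J.T @ W @ r)
    cov = np.linalg.inv(J.T @ W @ J)           # full parameter covariance
    resid = obs - forward(theta)
    chi2 = float(resid @ W @ resid)            # goodness-of-fit
    return theta, cov, chi2

sigma = np.full(4, 0.01)
theta_hat, cov, chi2 = fit(forward(np.array([1.5, 0.3])), sigma, [1.0, 0.0])
print(np.round(theta_hat, 3))
```

A weakly constrained parameter shows up as a large diagonal entry in `cov`, and a parameter degeneracy of the kind the abstract describes (Teff vs Av) shows up as a strong off-diagonal correlation.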

    Determination of stellar parameters with GAIA

    The GAIA Galactic survey satellite will obtain photometry in 15 filters of over 10^9 stars in our Galaxy across a very wide range of stellar types. No other planned survey will provide so much photometric information on so many stars. I examine the problem of how to determine fundamental physical parameters (Teff, log g, [Fe/H] etc.) from these data. Given the size, multidimensionality and diversity of this dataset, this is a challenging task beyond any encountered so far in large-scale stellar parametrization. I describe the problems faced (initial object identification, interstellar extinction, multiplicity, missing data etc.) and present a framework in which they can be addressed. A probabilistic approach is advocated on the grounds that it can take advantage of additional information (e.g. priors and data uncertainties) in a consistent and useful manner, as well as give meaningful results in the presence of poor or degenerate data. Furthermore, I suggest an approach to parametrization which can use the other information GAIA will acquire, in particular the parallax, which has not previously been available for large-scale multidimensional parametrization. Several of the problems identified and ideas suggested will be relevant to other large surveys, such as SDSS, DIVA, FAME, VISTA and LSST, as well as stellar parametrization in a virtual observatory.
    Comment: to appear in Astrophysics and Space Science
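A minimal illustration (my own toy, not the paper's framework) of why the probabilistic approach helps with degenerate data: multiply a bimodal photometric likelihood over Teff by a prior derived from additional information such as the parallax, and the posterior singles out one mode. All distributions below are invented Gaussians:

```python
import numpy as np

teff = np.linspace(4000.0, 8000.0, 401)

# Degenerate likelihood: two Teff values fit the photometry equally well.
like = np.exp(-0.5 * ((teff - 5000.0) / 150.0) ** 2) \
     + np.exp(-0.5 * ((teff - 7000.0) / 150.0) ** 2)

# Extra information (e.g. a luminosity constraint via the parallax)
# mildly prefers the cooler solution.
prior = np.exp(-0.5 * ((teff - 5200.0) / 600.0) ** 2)

post = like * prior
post /= post.sum() * (teff[1] - teff[0])   # normalise on the grid

print(int(teff[np.argmax(post)]))          # posterior picks the cool mode
```

With a flat prior the two modes would be indistinguishable; folding in the parallax-based constraint resolves the degeneracy while still reporting the full distribution rather than a single, possibly wrong, point estimate.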