1,078 research outputs found

    Improving Closely Spaced Dim Object Detection Through Improved Multiframe Blind Deconvolution

    This dissertation focuses on improving the ability to detect dim stellar objects in close proximity to a bright one through statistical image processing of short-exposure images, with the goal of improving space domain awareness capabilities using existing infrastructure. Two new algorithms are developed. The first is Neighborhood System Blind Deconvolution, in which the data functions are separated into a bright-object function, a neighborhood-system function, and a background function. The second is Dimension Reduction Blind Deconvolution, in which the object function is represented by the product of two matrices. Both are designed to overcome photon-counting noise and random, turbulent atmospheric conditions. The performance of the algorithms is compared with that of Multi-Frame Blind Deconvolution. The new algorithms are tested and validated with computer-generated data. The Neighborhood System Blind Deconvolution is also modified to overcome undersampling effects, since it is validated on undersampled, laboratory-collected data. Although the algorithms are designed for ground-to-space imaging systems, the same concept can be extended to space-to-space imaging. This research provides two improved techniques for closely spaced dim object detection.
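    The dissertation's two algorithms are only described at this level of detail here, but the Multi-Frame Blind Deconvolution baseline they are compared against follows a familiar iterative pattern. The sketch below is a generic multiframe Richardson-Lucy-style blind deconvolution under a Poisson (photon-counting) noise model, alternating object and per-frame PSF updates; the function name, arguments, and the simple PSF-support crop are illustrative assumptions, not the author's code.

```python
import numpy as np
from scipy.signal import fftconvolve

def mfbd_richardson_lucy(frames, psf_shape, n_iter=50, eps=1e-12):
    """Generic multiframe blind deconvolution sketch: alternating
    Richardson-Lucy updates of a shared object and per-frame PSFs,
    appropriate for Poisson (photon-counting) noise."""
    obj = np.clip(np.mean(frames, axis=0), eps, None)   # initial object: frame average
    psfs = [np.full(psf_shape, 1.0 / np.prod(psf_shape)) for _ in frames]

    for _ in range(n_iter):
        for k, frame in enumerate(frames):
            psf = psfs[k]
            # Object update, holding this frame's PSF fixed.
            model = fftconvolve(obj, psf, mode="same") + eps
            ratio = frame / model
            obj = np.clip(obj * fftconvolve(ratio, psf[::-1, ::-1], mode="same"),
                          eps, None)
            # PSF update, holding the object fixed.
            model = fftconvolve(obj, psf, mode="same") + eps
            ratio = frame / model
            corr = fftconvolve(ratio, obj[::-1, ::-1], mode="same")
            # Keep only the PSF-sized region around zero lag (image centre);
            # this crop is an approximation adequate for a sketch.
            cy, cx = np.array(corr.shape) // 2
            hy, hx = psf_shape[0] // 2, psf_shape[1] // 2
            psf = psf * corr[cy - hy:cy - hy + psf_shape[0],
                             cx - hx:cx - hx + psf_shape[1]]
            psf /= psf.sum() + eps                       # keep the PSF normalised
            psfs[k] = psf
    return obj, psfs
```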

    Data Mining and Machine Learning in Astronomy

    We review the current state of data mining and machine learning in astronomy. 'Data mining' can have a somewhat mixed connotation from the point of view of a researcher in this field. If used correctly, it can be a powerful approach, holding the potential to fully exploit the exponentially increasing amount of available data and promising great scientific advance. However, if misused, it can be little more than the black-box application of complex computing algorithms that gives little physical insight and provides questionable results. Here, we give an overview of the entire data mining process, from data collection through to the interpretation of results. We cover common machine learning algorithms, such as artificial neural networks and support vector machines; applications from a broad range of astronomy, emphasizing those where data mining techniques directly resulted in improved science; and important current and future directions, including probability density functions, parallel algorithms, petascale computing, and the time domain. We conclude that, so long as one carefully selects an appropriate algorithm and is guided by the astronomical problem at hand, data mining can be very much the powerful tool, and not the questionable black box. Comment: Published in IJMPD. 61 pages, uses ws-ijmpd.cls. Several extra figures, some minor additions to the text.
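    As a concrete illustration of one of the algorithms the review covers, the sketch below trains a support vector machine on synthetic colour-like features with scikit-learn; the data, feature values, and class labels are made up for illustration and are not taken from the paper.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic "photometric colour" features for two object classes;
# purely illustrative stand-ins for real survey measurements.
n = 2000
class_a = rng.normal(loc=[0.3, 1.1, 0.5], scale=0.25, size=(n, 3))
class_b = rng.normal(loc=[0.9, 0.4, 1.4], scale=0.30, size=(n, 3))
X = np.vstack([class_a, class_b])
y = np.concatenate([np.zeros(n), np.ones(n)])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# RBF-kernel support vector machine, one of the algorithms reviewed;
# feature scaling matters for kernel methods, hence the pipeline.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```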

    Constraining the Mass Profiles of Stellar Systems: Schwarzschild Modeling of Discrete Velocity Datasets

    (ABRIDGED) We present a new Schwarzschild orbit-superposition code designed to model discrete datasets composed of velocities of individual kinematic tracers in a dynamical system. This constitutes an extension of previous implementations that can only address continuous data in the form of (the moments of) velocity distributions, thus avoiding potentially important losses of information due to data binning. Furthermore, the code can handle any combination of available velocity components, i.e., only line-of-sight velocities, only proper motions, or a combination of both. It can also handle a combination of discrete and continuous data. The code finds the distribution function (DF, a function of the three integrals of motion E, Lz, and I3) that best reproduces the available kinematic and photometric observations in a given axisymmetric gravitational potential. The fully numerical approach ensures considerable freedom on the form of the DF f(E,Lz,I3). This allows a very general modeling of the orbital structure, thus avoiding restrictive assumptions about the degree of (an)isotropy of the orbits. We describe the implementation of the discrete code and present a series of tests of its performance based on the modeling of simulated datasets generated from a known DF. We find that the discrete Schwarzschild code recovers the original orbital structure, M/L ratios, and inclination of the input datasets to satisfactory accuracy, as quantified by various statistics. The code will be valuable, e.g., for modeling stellar motions in Galactic globular clusters, and those of individual stars, planetary nebulae, or globular clusters in nearby galaxies. This can shed new light on the total mass distributions of these systems, with central black holes and dark matter halos being of particular interest. Comment: ApJ, in press; 51 pages, 11 figures; manuscript revised following comments by referee.
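    The discrete code itself is not reproduced in the abstract, but the classic orbit-superposition step it extends is easy to sketch: given a precomputed orbit library, find non-negative orbit weights whose superposition reproduces a set of binned observables. The toy below does that with non-negative least squares; the paper's discrete likelihood over individual tracer velocities is more general, and the function name, array shapes, and numbers here are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import nnls

def fit_orbit_weights(orbit_predictions, observations, errors):
    """Toy Schwarzschild weight fit: find non-negative orbit weights w
    such that orbit_predictions @ w reproduces the observed constraints
    in a chi-squared sense.

    orbit_predictions : (n_constraints, n_orbits), each column the
                        contribution of one orbit to each observable
    observations      : (n_constraints,) observed values
    errors            : (n_constraints,) Gaussian uncertainties
    """
    # Scale each constraint by its uncertainty so nnls minimises chi^2.
    A = orbit_predictions / errors[:, None]
    b = observations / errors
    weights, residual_norm = nnls(A, b)
    return weights, residual_norm

# Tiny illustrative library: 50 orbits, 20 binned observables.
rng = np.random.default_rng(1)
library = np.abs(rng.normal(size=(20, 50)))
true_w = np.abs(rng.normal(size=50)) * (rng.random(50) < 0.2)   # sparse "DF"
data = library @ true_w + rng.normal(scale=0.01, size=20)
w_fit, chi = fit_orbit_weights(library, data, np.full(20, 0.01))
print("recovered non-zero orbit weights:", np.count_nonzero(w_fit))
```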

    Extreme deconvolution: Inferring complete distribution functions from noisy, heterogeneous and incomplete observations

    We generalize the well-known mixtures of Gaussians approach to density estimation, and the accompanying Expectation-Maximization technique for finding the maximum likelihood parameters of the mixture, to the case where each data point carries an individual d-dimensional uncertainty covariance and has unique missing data properties. This algorithm reconstructs the error-deconvolved or "underlying" distribution function common to all samples, even when the individual data points are samples from different distributions, obtained by convolving the underlying distribution with the heteroskedastic uncertainty distribution of the data point and projecting out the missing data directions. We show how this basic algorithm can be extended with conjugate priors on all of the model parameters and a "split-and-merge" procedure designed to avoid local maxima of the likelihood. We demonstrate the full method by applying it to the problem of inferring the three-dimensional velocity distribution of stars near the Sun from noisy two-dimensional, transverse velocity measurements from the Hipparcos satellite. Comment: Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org/) at http://dx.doi.org/10.1214/10-AOAS439.
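    The EM update the abstract describes, with each point convolved with its own uncertainty, can be written compactly. The sketch below implements one extreme-deconvolution EM iteration for fully observed data, omitting the projection matrices that handle missing data directions; the function name and looping structure are simplified assumptions rather than the published implementation.

```python
import numpy as np
from scipy.stats import multivariate_normal

def xd_em_step(x, S, alpha, mu, V):
    """One EM iteration of extreme deconvolution (no missing-data
    projection).  x : (n, d) noisy observations; S : (n, d, d) per-point
    noise covariances; (alpha, mu, V) : current mixture weights (k,),
    means (k, d), and underlying covariances (k, d, d)."""
    n, d = x.shape
    k = alpha.shape[0]
    q = np.zeros((n, k))          # responsibilities
    b = np.zeros((n, k, d))       # E[true value | x_i, component j]
    B = np.zeros((n, k, d, d))    # Cov[true value | x_i, component j]

    for j in range(k):
        for i in range(n):
            T = V[j] + S[i]                      # convolved covariance
            q[i, j] = alpha[j] * multivariate_normal.pdf(x[i], mu[j], T)
            K = V[j] @ np.linalg.inv(T)          # "Kalman gain"
            b[i, j] = mu[j] + K @ (x[i] - mu[j])
            B[i, j] = V[j] - K @ V[j]
    q /= q.sum(axis=1, keepdims=True)

    # M-step: update weights, means, and underlying covariances.
    qj = q.sum(axis=0)
    alpha_new = qj / n
    mu_new = np.einsum('ij,ijd->jd', q, b) / qj[:, None]
    diff = b - mu_new[None, :, :]
    V_new = (np.einsum('ij,ijd,ije->jde', q, diff, diff)
             + np.einsum('ij,ijde->jde', q, B)) / qj[:, None, None]
    return alpha_new, mu_new, V_new
```

    Iterating this step to convergence recovers the underlying mixture even when every observation has a different noise covariance, which is the core of the method summarized above.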

    Stellar Activity in the Broad-Band Ultraviolet

    The completion of the GALEX All-Sky Survey in the ultraviolet allows activity measurements to be acquired for many more stars than is possible with the limited sensitivity of ROSAT or the limited sky coverage of Chandra, XMM, or spectroscopic surveys for line emission in the optical or ultraviolet. We have explored the use of GALEX photometry as an activity indicator, using as a calibration sample stars within 50 pc, representing the field, and in selected nearby associations, representing the youngest stages of stellar evolution. We present preliminary relations between UV flux and the optical activity indicator R'_HK and between UV flux and age. We demonstrate that far-UV (FUV, 1350-1780 Å) excess flux is roughly proportional to R'_HK. We also detect a correlation between near-UV (NUV, 1780-2830 Å) flux and activity or age, but the effect is much more subtle, particularly for stars older than ~0.5-1 Gyr. Both the FUV and NUV relations show large scatter: ~0.2 mag when predicting UV flux, ~0.18 dex when predicting R'_HK, and ~0.4 dex when predicting age. This scatter appears to be evenly split between observational errors in current state-of-the-art data and long-term activity variability in the sample stars. Comment: 37 pages, 12 figures. To appear in the Astronomical Journal.

    Bayesian Methods for Analysis and Adaptive Scheduling of Exoplanet Observations

    We describe work in progress by a collaboration of astronomers and statisticians developing a suite of Bayesian data analysis tools for extrasolar planet (exoplanet) detection, planetary orbit estimation, and adaptive scheduling of observations. Our work addresses analysis of stellar reflex motion data, where a planet is detected by observing the "wobble" of its host star as it responds to the gravitational tug of the orbiting planet. Newtonian mechanics specifies an analytical model for the resulting time series, but it is strongly nonlinear, yielding complex, multimodal likelihood functions; it is even more complex when multiple planets are present. The parameter spaces range in size from few-dimensional to dozens of dimensions, depending on the number of planets in the system and the type of motion measured (line-of-sight velocity, or position on the sky). Since orbits are periodic, Bayesian generalizations of periodogram methods facilitate the analysis. This relies on the model being linearly separable, enabling partial analytical marginalization and reducing the dimension of the parameter space. Subsequent analysis uses adaptive Markov chain Monte Carlo methods and adaptive importance sampling to perform the integrals required for both inference (planet detection and orbit measurement) and information-maximizing sequential design (for adaptive scheduling of observations). We present an overview of our current techniques and highlight directions being explored by ongoing research. Comment: 29 pages, 11 figures. An abridged version is accepted for publication in Statistical Methodology for a special issue on astrostatistics, with selected (refereed) papers presented at the Astronomical Data Analysis Conference (ADA VI) held in Monastir, Tunisia, in May 2010. Update corrects equation (3).
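    The "linearly separable model enables partial analytical marginalization" idea can be illustrated for the simplest case, a circular orbit: for a fixed trial period the radial-velocity model is linear in its amplitude and offset parameters, which can be integrated out analytically, leaving a one-dimensional scan over period. The sketch below does exactly that assuming broad flat priors; it is a toy stand-in rather than the collaboration's tool suite (which handles eccentric Keplerian orbits and uses adaptive MCMC), and all numbers in the usage example are made up.

```python
import numpy as np

def rv_period_scan(t, v, sigma, periods):
    """Toy Bayesian-periodogram scan for a circular-orbit radial-velocity
    model v(t) = A cos(2*pi*t/P) + B sin(2*pi*t/P) + gamma.

    For each trial period P the model is linear in (A, B, gamma), so those
    parameters are integrated out analytically (broad flat priors assumed),
    leaving log marginal likelihoods up to a P-independent constant."""
    logL = np.empty(len(periods))
    w = 1.0 / sigma**2
    for idx, P in enumerate(periods):
        phase = 2.0 * np.pi * t / P
        # Design matrix of the linear parameters (A, B, gamma).
        X = np.column_stack([np.cos(phase), np.sin(phase), np.ones_like(t)])
        # Weighted normal equations: (X^T W X) theta = X^T W v
        XtWX = X.T @ (w[:, None] * X)
        theta = np.linalg.solve(XtWX, X.T @ (w * v))
        chi2 = np.sum(w * (v - X @ theta) ** 2)
        # Gaussian integral over (A, B, gamma): exp(-chi2/2)/sqrt(det(X^T W X))
        logL[idx] = -0.5 * chi2 - 0.5 * np.linalg.slogdet(XtWX)[1]
    return logL

# Illustrative usage with synthetic data (all numbers made up).
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 200, 60))          # observation epochs [days]
sigma = np.full_like(t, 3.0)                  # 3 m/s uncertainties
v = 12.0 * np.sin(2 * np.pi * t / 37.0) + 5.0 + rng.normal(0, sigma)
periods = np.linspace(5, 100, 2000)
best = periods[np.argmax(rv_period_scan(t, v, sigma, periods))]
print("best trial period ~", round(best, 1), "days")
```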