
    Starburst and cirrus models for submillimeter galaxies

    We present radiative transfer models for submillimeter galaxies with spectroscopic redshifts and mid-infrared spectroscopy from Spitzer/IRS, and analyze the available Spitzer/MIPS 24, 70 and 160 μm data. We use two types of starburst models, a cirrus model and a model for the emission of an AGN torus in order to investigate the nature of these objects. We find that for three of the objects (25%) cirrus emission alone can account for the mid-infrared spectrum and the MIPS and submillimeter data. For the remaining objects we need a combination of starburst and cirrus in order to fit the multi-wavelength data simultaneously. We find that the typical submillimeter galaxy has comparable luminosity in the starburst (median L=10^12.5 L_sun) and cirrus (median L=10^12.4 L_sun) components. This could arise if the galaxies have been forming stars continuously for the last 250 Myr, with the star formation occurring in the last 5 Myr being shrouded by high-optical-depth molecular cloud dust, whereas the rest of the starlight is attenuated by diffuse dust or cirrus with an A_V of about 1 mag. Comment: 9 pages, A&A accepted
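    A rough illustration of the fitting step: once starburst and cirrus template SEDs are in hand, their amplitudes can be found by weighted non-negative least squares against the MIPS and submillimeter photometry. The sketch below uses entirely made-up template and flux values; the paper itself fits full radiative transfer models.

        import numpy as np
        from scipy.optimize import nnls

        # Placeholder template SEDs and photometry (flux densities in arbitrary units);
        # wavelengths correspond to the MIPS 24/70/160 um bands plus a submm point.
        wavelengths_um = np.array([24.0, 70.0, 160.0, 850.0])
        starburst_template = np.array([0.8, 6.0, 9.0, 1.5])   # hypothetical shape
        cirrus_template = np.array([0.1, 2.0, 8.0, 3.0])      # hypothetical shape
        observed = np.array([0.5, 5.0, 12.0, 3.5])
        errors = np.array([0.05, 0.5, 1.5, 0.5])

        # Weighted non-negative least squares for the two component amplitudes
        A = np.column_stack([starburst_template, cirrus_template]) / errors[:, None]
        b = observed / errors
        amps, resid = nnls(A, b)
        print("starburst, cirrus amplitudes:", amps, "residual norm:", resid)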

    Moving Objects in the Hubble Ultra Deep Field

    We identify proper motion objects in the Hubble Ultra Deep Field (UDF) using the optical data from the original UDF program in 2004 and the near-infrared data from the 128-orbit UDF 2012 campaign. There are 12 sources brighter than I=27 mag that display >3σ significant proper motions. We do not find any proper motion objects fainter than this magnitude limit. Combining optical and near-infrared photometry, we model the spectral energy distribution of each point source using stellar templates and state-of-the-art white dwarf models. For I<27 mag, we identify 23 stars with K0-M6 spectral types and two faint blue objects that are clearly old, thick disk white dwarfs. From these two objects we measure a thick disk white dwarf space density of 0.1-1.7 x 10^-3 per cubic parsec. There are no halo white dwarfs in the UDF down to I=27 mag. Combining the Hubble Deep Field North, South, and the UDF data, we see no evidence for dark matter in the form of faint halo white dwarfs, and the observed population of white dwarfs can be explained with the standard Galactic models. Comment: ApJ, in press
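    For intuition, a proper motion significance check between the two epochs reduces to dividing the positional shift by the time baseline and by the combined centroid uncertainty. The sketch below uses hypothetical positions and errors, not the UDF astrometry.

        import numpy as np

        def proper_motion_sigma(pos_2004, pos_2012, err_2004, err_2012, baseline_yr=8.0):
            """Proper motion (arcsec/yr) and a rough significance from two-epoch
            positions; positions and 1-sigma centroid errors are in arcsec."""
            dpos = np.asarray(pos_2012) - np.asarray(pos_2004)
            mu = np.linalg.norm(dpos) / baseline_yr
            err = np.sqrt(np.sum(np.asarray(err_2004)**2 + np.asarray(err_2012)**2)) / baseline_yr
            return mu, mu / err

        # Hypothetical source: shifts by ~13 mas over 8 years with 2 mas centroid errors
        mu, sig = proper_motion_sigma((1.000, 2.000), (1.012, 2.004),
                                      (0.002, 0.002), (0.002, 0.002))
        print(f"mu = {mu*1e3:.1f} mas/yr at {sig:.1f} sigma")  # keep sources above 3 sigma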

    Efficient Regularized Least-Squares Algorithms for Conditional Ranking on Relational Data

    In domains like bioinformatics, information retrieval and social network analysis, one can find learning tasks where the goal consists of inferring a ranking of objects, conditioned on a particular target object. We present a general kernel framework for learning conditional rankings from various types of relational data, where rankings can be conditioned on unseen data objects. We propose efficient algorithms for conditional ranking by optimizing squared regression and ranking loss functions. We show theoretically that learning with the ranking loss is likely to generalize better than with the regression loss. Further, we prove that symmetry or reciprocity properties of relations can be efficiently enforced in the learned models. Experiments on synthetic and real-world data illustrate that the proposed methods deliver state-of-the-art performance in terms of predictive power and computational efficiency. Moreover, we show empirically that incorporating symmetry or reciprocity properties can improve the generalization performance.
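    One common closed form for this kind of regularized least-squares ranking (a RankRLS-style objective over a complete pairwise preference graph) is sketched below; the kernel, data and regularization are illustrative and not taken from the paper.

        import numpy as np

        def rank_rls_fit(K, y, lam=1.0):
            """Kernel RLS with a pairwise squared ranking loss:
            minimize sum_{i,j} ((y_i - y_j) - (f(x_i) - f(x_j)))^2 + lam * ||f||^2
            for f(x) = sum_k a_k K(x_k, x). With L = n*I - 11^T this reduces to
            solving (L K + lam I) a = L y."""
            n = len(y)
            L = n * np.eye(n) - np.ones((n, n))
            return np.linalg.solve(L @ K + lam * np.eye(n), L @ y)

        rng = np.random.default_rng(0)
        X = rng.normal(size=(30, 5))
        y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=30)
        K = X @ X.T                        # linear kernel, for simplicity
        a = rank_rls_fit(K, y, lam=1.0)
        pred = K @ a
        print(np.corrcoef(pred, y)[0, 1])  # predicted scores should order items like y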

    Model-Based Geostatistics the Easy Way

    This paper briefly describes geostatistical models for Gaussian and non-Gaussian data and demonstrates the geostatsp and diseasemapping packages for performing inference using these models. Making use of R’s spatial data types, and raster objects in particular, makes spatial analyses using geostatistical models simple and convenient. Examples using real data are shown for Gaussian spatial data, binomially distributed spatial data, a log-Gaussian Cox process, and an area-level model for case counts.
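    The paper works through the R packages named above; purely as an illustration of the Gaussian model underneath (an exponential covariance with a nugget and simple kriging prediction), here is a minimal sketch with assumed parameter values.

        import numpy as np

        def exp_cov(d, sigma2, phi):
            """Exponential covariance (Matern, smoothness 1/2): sigma2 * exp(-d/phi)."""
            return sigma2 * np.exp(-d / phi)

        def simple_krige(coords, z, new_coords, sigma2=1.0, phi=0.3, tau2=0.01):
            """Simple kriging with known zero mean: z_hat(s0) = c0^T (C + tau2 I)^{-1} z."""
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            C = exp_cov(d, sigma2, phi) + tau2 * np.eye(len(z))
            d0 = np.linalg.norm(new_coords[:, None, :] - coords[None, :, :], axis=-1)
            return exp_cov(d0, sigma2, phi) @ np.linalg.solve(C, z)

        rng = np.random.default_rng(1)
        coords = rng.uniform(size=(50, 2))
        z = np.sin(3 * coords[:, 0]) + np.cos(3 * coords[:, 1])   # toy spatial field
        grid = rng.uniform(size=(5, 2))
        print(simple_krige(coords, z, grid))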

    Shape models and physical properties of asteroids

    Despite the large amount of high quality data generated in recent space encounters with asteroids, the majority of our knowledge about these objects comes from ground-based observations. Asteroids travelling in orbits that are potentially hazardous for the Earth form an especially interesting group to be studied. In order to predict their orbital evolution, it is necessary to investigate their physical properties. This paper briefly describes the data requirements and the different techniques used to solve the lightcurve inversion problem. Although photometry is the most abundant type of observational data, models of asteroids can be obtained using various data types and techniques. We describe the potential of radar imaging and stellar occultation timings to be combined with disk-integrated photometry in order to reveal information about the physical properties of asteroids. Comment: From the Assessment and Mitigation of Asteroid Impact Hazards book

    Fast Autocorrelated Context Models for Data Compression

    A method is presented to automatically generate context models of data by calculating the data's autocorrelation function. The largest values of the autocorrelation function occur at the offsets or lags in the bitstream which tend to be the most highly correlated with any particular location. These offsets are ideal for use in predictive coding, such as prediction by partial matching (PPM) or context-mixing algorithms for data compression, making such algorithms more efficient and more general by reducing or eliminating the need for ad hoc models based on particular types of data. Instead of using the definition of the autocorrelation function, which considers the pairwise correlations of the data and requires O(n^2) time, the Wiener-Khinchin theorem is applied, obtaining the autocorrelation as the inverse fast Fourier transform of the data's power spectrum in O(n log n) time and making the technique practical for the compression of large data objects. The method is shown to produce the highest levels of performance obtained to date on a lossless image compression benchmark. Comment: v2 includes bibliography
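    A minimal sketch of the Wiener-Khinchin step: take the power spectrum with an FFT, invert it to get the autocorrelation in O(n log n), and keep the lags with the largest values as candidate context offsets. The function name, ranking choice and toy byte stream are illustrative, not the paper's implementation.

        import numpy as np

        def top_context_lags(data: bytes, num_lags: int = 8):
            """Rank candidate context offsets for a byte stream by autocorrelation,
            computed as the inverse FFT of the power spectrum (Wiener-Khinchin)."""
            x = np.frombuffer(data, dtype=np.uint8).astype(np.float64)
            x -= x.mean()                        # remove the mean so lag 0 does not swamp the rest
            n = len(x)
            spectrum = np.fft.rfft(x, n=2 * n)   # zero-pad to 2n for a linear autocorrelation
            acf = np.fft.irfft(spectrum * np.conj(spectrum))[:n]
            order = np.argsort(acf[1:])[::-1] + 1  # skip the trivial lag-0 maximum
            return order[:num_lags].tolist()

        # A stream with period 7 should put lags 7, 14, 21, ... near the top.
        stream = bytes((i % 7) * 13 for i in range(4096))
        print(top_context_lags(stream, num_lags=4))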