    Galactic extinction and Abell clusters

    In this paper, we present the results of comparing the angular distribution of Abell clusters with Galactic HI measurements. For most subsamples of clusters considered, the cluster positions on the sky appear to be anti-correlated with the distribution of HI column densities. The statistical significance of these observed anti-correlations is a function of both richness and distance class, with the more distant and/or richest systems showing the highest significance (~3 sigma). The lower-richness, nearby clusters appear to be randomly distributed with respect to the observed Galactic HI column density. Comment: 5 pages, uuencoded compressed postscript file. Figures included. Accepted by MNRAS
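
    As a rough illustration of how such an anti-correlation can be quantified, the sketch below computes a rank correlation between the number of clusters per sky cell and the HI column density in that cell, and estimates its significance with a shuffle test. The cell-based statistic, the function name, and the sigma estimate are illustrative assumptions, not the analysis used in the paper.

        import numpy as np
        from scipy.stats import spearmanr

        def anticorrelation_significance(cluster_counts, log_nhi, n_shuffle=1000, seed=0):
            """Rank correlation between cluster counts per sky cell and log N(HI),
            with a permutation test for significance (illustrative sketch only).
            A significantly negative rho means clusters avoid regions of high
            Galactic HI column density, i.e. high extinction."""
            rho_obs, _ = spearmanr(cluster_counts, log_nhi)
            rng = np.random.default_rng(seed)
            rho_null = np.array([
                spearmanr(rng.permutation(cluster_counts), log_nhi)[0]
                for _ in range(n_shuffle)
            ])
            # Express the offset from the shuffled null in units of its scatter.
            return rho_obs, (rho_obs - rho_null.mean()) / rho_null.std()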

    Are the Earth and the Moon compositionally alike? Inferences on lunar composition and implications for lunar origin and evolution from geophysical modeling

    The main objective of the present study is to discuss in detail the results obtained from an inversion of the Apollo lunar seismic data set, lunar mass, and moment of inertia. We invert directly for lunar chemical composition and temperature using the model system CaO-FeO-MgO-Al2O3-SiO2. Using Gibbs free energy minimization, we calculate the stable mineral phases at the temperatures and pressures of interest, together with their modes and physical properties. We determine the compositional range of the oxide elements, thermal state, Mg#, mineralogy and physical structure of the lunar interior, as well as constrain core size and density. The results indicate a lunar mantle mineralogy that is dominated by olivine and orthopyroxene (~80 vol%), with the remainder being composed of clinopyroxene and an aluminous phase (plagioclase, spinel, and garnet present in the depth ranges 0–150 km, 150–200 km, and >200 km, respectively). This model is broadly consistent with constraints on mantle mineralogy derived from the experimental and observational study of the phase relationships and trace element compositions of lunar mare basalts and picritic glasses. In particular, by melting a typical model mantle composition using the pMELTS algorithm, we found that a range of batch melts generated from these models have features in common with low-Ti mare basalts and picritic glasses. Our results also indicate a bulk lunar composition and Mg# different from that of the Earth's upper mantle, represented by the pyrolite composition. This difference is reflected in a lower bulk lunar Mg# (~0.83). Results also indicate a small iron-like core with a radius of around 340 km. The Carlsberg Foundation, NER
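
    As a small worked example of one quantity quoted above, the snippet below converts oxide weight percents to a molar Mg# = MgO/(MgO + FeO). The oxide abundances in the example call are hypothetical placeholders chosen only to show how a bulk Mg# near 0.83 arises; they are not the paper's inverted composition.

        # Standard molar masses in g/mol.
        M_MGO, M_FEO = 40.304, 71.844

        def mg_number(wt_mgo, wt_feo):
            """Molar Mg# = MgO / (MgO + FeO), computed from oxide weight percents."""
            n_mgo = wt_mgo / M_MGO   # moles of MgO per 100 g of rock
            n_feo = wt_feo / M_FEO   # moles of FeO per 100 g of rock
            return n_mgo / (n_mgo + n_feo)

        # Hypothetical bulk-silicate oxide abundances (wt%), for illustration only.
        print(round(mg_number(wt_mgo=35.0, wt_feo=12.5), 3))  # -> 0.833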

    A Robust Classification of Galaxy Spectra: Dealing with Noisy and Incomplete Data

    Over the next few years, new spectroscopic surveys (from the optical surveys of the Sloan Digital Sky Survey and the 2 degree Field survey through to space-based ultraviolet satellites such as GALEX) will provide the opportunity and challenge of understanding how galaxies of different spectral type evolve with redshift. Techniques have been developed to classify galaxies based on their continuum and line spectra. Some of the most promising of these have used the Karhunen-Loeve transform (or Principal Component Analysis) to separate galaxies into distinct classes. Their limitation has been the assumption that the spectral coverage and quality of the spectra are constant for all galaxies within a given sample. In this paper we develop a general formalism that accounts for missing data within the observed spectra (such as the removal of sky lines or the sampling of different intrinsic rest-wavelength ranges due to the redshift of a galaxy). We demonstrate that by correcting for these gaps we can recover an almost redshift-independent classification scheme. From this classification we can derive an optimal interpolation that reconstructs the underlying galaxy spectral energy distributions in the regions of missing data. This provides a simple and effective mechanism for building galaxy spectral energy distributions directly from data that may be noisy, incomplete or drawn from a number of different sources. Comment: 20 pages, 8 figures. Accepted for publication in A
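
    A minimal sketch of the gap-correction idea described above: fit the expansion coefficients of a set of eigenspectra to only the observed pixels of a spectrum via weighted least squares, then use those coefficients to reconstruct the spectrum across the gaps. The function name and array layout are assumptions for illustration; this is not necessarily the exact formalism of the paper.

        import numpy as np

        def gappy_pca_reconstruct(spectrum, weights, eigenspectra, mean_spectrum):
            """Reconstruct a spectrum with missing pixels using a PCA eigenbasis.

            spectrum, weights, mean_spectrum : 1-D arrays on a common wavelength grid;
                weights are 0 where data are missing (sky lines, unobserved rest-frame
                wavelengths) and 1 (or inverse variance) where data are present.
            eigenspectra : 2-D array of shape (n_components, n_pixels).
            """
            residual = spectrum - mean_spectrum
            # Weighted normal equations M a = F; the weights zero out the gaps.
            M = eigenspectra @ (weights[:, None] * eigenspectra.T)
            F = eigenspectra @ (weights * residual)
            coeffs = np.linalg.solve(M, F)
            # The eigenbasis expansion interpolates across the missing regions.
            return mean_spectrum + coeffs @ eigenspectra, coeffs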

    Image Coaddition with Temporally Varying Kernels

    Large, multi-frequency imaging surveys, such as the Large Synoptic Survey Telescope (LSST), need to perform near-real-time analysis of very large datasets. This raises a host of statistical and computational problems for which standard methods do not work. In this paper, we study a proposed method for combining stacks of images into a single summary image, sometimes referred to as a template. This task is commonly referred to as image coaddition. In part, we focus on a method proposed in previous work, which outlines a procedure for combining stacks of images in an online fashion in the Fourier domain. We evaluate this method by comparing it to two straightforward methods through the use of various criteria and simulations. Note that the goal is not to propose these comparison methods for use in their own right, but to ensure that the additional complexity also provides substantially improved performance.
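
    To make the online Fourier-domain idea concrete, the sketch below accumulates registered exposures one at a time as PSF-matched, inverse-variance-weighted sums of their Fourier transforms, and forms a template on demand. The class name and the particular weighting scheme are assumptions for illustration; they are not claimed to be the estimator studied in the paper.

        import numpy as np

        class OnlineFourierCoadd:
            """Running coadd of registered exposures, accumulated in the Fourier
            domain (illustrative sketch, not the paper's exact estimator)."""

            def __init__(self, shape):
                self.numerator = np.zeros(shape, dtype=complex)
                self.denominator = np.zeros(shape, dtype=float)

            def add_exposure(self, image, psf, noise_var):
                """Fold in one exposure and its same-size, centred PSF (kernel)."""
                I = np.fft.fft2(image)
                P = np.fft.fft2(np.fft.ifftshift(psf))
                # Matched-filter style accumulation, weighted by inverse variance.
                self.numerator += np.conj(P) * I / noise_var
                self.denominator += np.abs(P) ** 2 / noise_var

            def template(self, eps=1e-12):
                """Return the current summary (template) image."""
                T = self.numerator / (self.denominator + eps)
                return np.real(np.fft.ifft2(T))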

    A new source detection algorithm using FDR

    The False Discovery Rate (FDR) method has recently been described by Miller et al. (2001), along with several examples of astrophysical applications. FDR is a new statistical procedure due to Benjamini and Hochberg (1995) for controlling the fraction of false positives when performing multiple hypothesis testing. The importance of this method to source detection algorithms is immediately clear. To explore the possibilities offered, we have developed a new task for performing source detection in radio-telescope images, Sfind 2.0, which implements FDR. We compare Sfind 2.0 with two other source detection and measurement tasks, Imsad and SExtractor, and comment on several issues arising from the nature of the correlation between nearby pixels and the necessary assumption of the null hypothesis. The strong suggestion is made that implementing FDR as a threshold-defining method in other existing source-detection tasks is easy and worthwhile. We show that the constraint on the fraction of false detections as specified by FDR holds true even for highly correlated and realistic images. For the detection of true sources, which are complex combinations of source pixels, this constraint appears to be somewhat less strict. It is still reliable enough, however, for a priori estimates of the fraction of false source detections to be robust and realistic. Comment: 17 pages, 7 figures, accepted for publication by A
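
    As a compact illustration of the thresholding step that FDR provides, the sketch below applies the Benjamini-Hochberg procedure to per-pixel p-values computed against a Gaussian noise-only null, returning a mask of candidate source pixels. The noise model, function name and parameter choices are assumptions for illustration; Sfind 2.0's actual implementation is not reproduced here.

        import numpy as np
        from scipy.stats import norm

        def fdr_threshold(image, sigma, alpha=0.05):
            """Benjamini-Hochberg FDR thresholding against a zero-mean Gaussian
            noise-only null (illustrative sketch)."""
            # One-sided p-value for each pixel under the noise-only hypothesis.
            pvals = norm.sf(image.ravel() / sigma)
            n = pvals.size
            order = np.argsort(pvals)
            sorted_p = pvals[order]
            # Largest k with p_(k) <= k * alpha / n defines the rejection set.
            below = sorted_p <= np.arange(1, n + 1) * alpha / n
            detected = np.zeros(n, dtype=bool)
            if below.any():
                k = np.nonzero(below)[0].max()
                detected[order[:k + 1]] = True
            return detected.reshape(image.shape)

        # Example: pure noise with a few bright "source" pixels injected.
        rng = np.random.default_rng(0)
        img = rng.normal(0.0, 1.0, (128, 128))
        img[60:63, 60:63] += 6.0
        mask = fdr_threshold(img, sigma=1.0, alpha=0.01)
        print(mask.sum(), "pixels flagged as source candidates")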