    A comparison of reference-based algorithms for correcting cell-type heterogeneity in Epigenome-Wide Association Studies.

    BACKGROUND: Intra-sample cellular heterogeneity presents numerous challenges to the identification of biomarkers in large Epigenome-Wide Association Studies (EWAS). While a number of reference-based deconvolution algorithms have emerged, their potential remains underexplored and a comparative evaluation of these algorithms beyond tissues such as blood is still lacking. RESULTS: Here we present a novel framework for reference-based inference, which leverages cell-type specific DNase Hypersensitive Site (DHS) information from the NIH Epigenomics Roadmap to construct an improved reference DNA methylation database. We show that this leads to a marginal but statistically significant improvement of cell-count estimates in whole blood as well as in mixtures involving epithelial cell-types. Using this framework we compare a widely used state-of-the-art reference-based algorithm (called constrained projection) to two non-constrained approaches, including CIBERSORT and a method based on robust partial correlations. We conclude that the widely-used constrained projection technique may not always be optimal. Instead, we find that the method based on robust partial correlations is generally more robust across a range of different tissue types and for realistic noise levels. We call the combined algorithm, which uses DHS data and robust partial correlations for inference, EpiDISH (Epigenetic Dissection of Intra-Sample Heterogeneity). Finally, we demonstrate the added value of EpiDISH in an EWAS of smoking. CONCLUSIONS: Estimating cell-type fractions and subsequent inference in EWAS may benefit from the use of non-constrained reference-based cell-type deconvolution methods.
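
    To make the constrained-projection idea concrete, below is a minimal sketch of reference-based deconvolution as a constrained least-squares fit of a bulk methylation profile against a reference matrix of cell-type specific CpG profiles. The function name, the toy reference data, and the SLSQP solver choice are illustrative assumptions; this is not the EpiDISH implementation.

```python
# Minimal sketch of reference-based cell-type deconvolution by constrained
# projection (constrained least squares). The data and names are toy
# assumptions, not the EpiDISH implementation.
import numpy as np
from scipy.optimize import minimize

def constrained_projection(beta, ref):
    """Estimate cell-type fractions for one bulk methylation profile.

    beta : (n_cpgs,) bulk DNA methylation beta values at marker CpGs
    ref  : (n_cpgs, n_celltypes) reference methylation profiles
    Returns a fraction vector constrained to be non-negative and sum to 1.
    """
    k = ref.shape[1]
    w0 = np.full(k, 1.0 / k)                       # uniform starting point
    objective = lambda w: np.sum((beta - ref @ w) ** 2)
    constraints = {"type": "eq", "fun": lambda w: w.sum() - 1.0}
    bounds = [(0.0, 1.0)] * k
    res = minimize(objective, w0, method="SLSQP",
                   bounds=bounds, constraints=constraints)
    return res.x

# Toy example: mixture of 3 synthetic cell types at 50 marker CpGs.
rng = np.random.default_rng(0)
ref = rng.uniform(0, 1, size=(50, 3))
true_w = np.array([0.6, 0.3, 0.1])
bulk = ref @ true_w + rng.normal(0, 0.02, size=50)  # noisy bulk profile
print(constrained_projection(bulk, ref))            # ~ [0.6, 0.3, 0.1]
```

    A non-constrained alternative, such as the robust partial correlation approach favoured above, would replace the constrained fit with a correlation-based estimate per cell type; the reference matrix and marker CpG selection play the same role in either case.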

    Parametric high resolution techniques for radio astronomical imaging

    The increased sensitivity of future radio telescopes will result in requirements for higher dynamic range within the image as well as better resolution and immunity to interference. In this paper we propose a new matrix formulation of the imaging equation for the cases of non-coplanar arrays and polarimetric measurements. We then improve our parametric imaging techniques in terms of resolution and estimation accuracy. This is done by enhancing the MVDR parametric imaging through alternative dirty images and by introducing better power estimates based on least squares with positive semi-definite constraints. We also discuss the use of robust Capon beamforming and semi-definite programming for solving the self-calibration problem. Additionally, we provide a statistical analysis of the bias of the MVDR beamformer for the case of a moving array, which serves as a first step in analyzing iterative approaches such as CLEAN and the techniques proposed in this paper. Finally, we demonstrate a full deconvolution process based on the parametric imaging techniques and show its improved resolution and sensitivity compared to the CLEAN method. (To appear in IEEE Journal of Selected Topics in Signal Processing, Special Issue on Signal Processing for Astronomy and Space Research.)
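
    As a rough illustration of the MVDR (Capon) power estimate underlying the parametric imaging discussed above, the sketch below contrasts it with the classical dirty-image (matched-filter) estimate for a toy one-dimensional array. The array geometry, source parameters, and diagonal loading are assumptions made for illustration, not the paper's exact formulation.

```python
# Minimal sketch of an MVDR (Capon) power estimate per sky direction,
# contrasted with the classical dirty-image (matched-filter) estimate.
# Geometry, source parameters, and loading are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_ant = 16
positions = rng.uniform(0, 50, size=n_ant)           # 1-D antenna positions (wavelengths)

def steering(l):
    """Narrowband, unit-norm steering vector towards direction cosine l."""
    return np.exp(2j * np.pi * positions * l) / np.sqrt(n_ant)

# Simulate the array covariance of two point sources plus unit-power noise.
sources = [(-0.3, 4.0), (0.2, 1.0)]                   # (direction cosine, power)
R = np.eye(n_ant, dtype=complex)
for l, p in sources:
    a = steering(l)
    R += p * np.outer(a, a.conj())

R_inv = np.linalg.inv(R + 1e-3 * np.eye(n_ant))       # diagonal loading for robustness
grid = np.linspace(-1, 1, 401)
dirty = [np.real(steering(l).conj() @ R @ steering(l)) for l in grid]       # classical
mvdr = [1.0 / np.real(steering(l).conj() @ R_inv @ steering(l)) for l in grid]  # Capon

# The MVDR spectrum resolves the two sources with lower sidelobes than the
# classical dirty image; its peak lies near the stronger source.
print(grid[np.argmax(mvdr)])
```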

    Image formation in synthetic aperture radio telescopes

    Next generation radio telescopes will be much larger, more sensitive, have a much larger observation bandwidth, and will be capable of pointing multiple beams simultaneously. Obtaining the sensitivity, resolution and dynamic range supported by the receivers requires the development of new signal processing techniques for array and atmospheric calibration, as well as new imaging techniques that are both more accurate and computationally efficient, since data volumes will be much larger. This paper provides a tutorial overview of existing image formation techniques and outlines some of the future directions needed for information extraction from future radio telescopes. We describe the imaging process from the measurement equation through deconvolution, both as a Fourier inversion problem and as an array processing estimation problem. The latter formulation enables the development of more advanced techniques based on state-of-the-art array processing. We demonstrate the techniques on simulated and measured radio telescope data.
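
    As a small illustration of the Fourier-inversion view of the measurement equation, the sketch below forms a dirty image by applying the adjoint of the measurement operator (a direct Fourier sum) to simulated visibilities. Baseline coordinates, source positions, and the image grid are toy assumptions; practical imagers use gridding and the FFT rather than this brute-force sum.

```python
# Minimal sketch of dirty-image formation by direct Fourier inversion of
# visibilities V(u,v) onto an (l, m) grid. All quantities are toy assumptions.
import numpy as np

rng = np.random.default_rng(2)
n_vis = 500
u = rng.uniform(-200, 200, n_vis)                    # baseline u coordinates (wavelengths)
v = rng.uniform(-200, 200, n_vis)                    # baseline v coordinates (wavelengths)

# Small-field 2-D measurement equation:
#   V(u, v) = sum_k I_k * exp(-2*pi*i * (u*l_k + v*m_k))
sources = [(0.01, -0.005, 2.0), (-0.02, 0.015, 1.0)]  # (l, m, flux)
vis = np.zeros(n_vis, dtype=complex)
for l0, m0, flux in sources:
    vis += flux * np.exp(-2j * np.pi * (u * l0 + v * m0))

# Dirty image: adjoint of the measurement operator evaluated on an (l, m) grid.
l = np.linspace(-0.03, 0.03, 121)
m = np.linspace(-0.03, 0.03, 121)
L, M = np.meshgrid(l, m, indexing="ij")
dirty = np.zeros_like(L)
for i in range(L.shape[0]):
    phase = np.exp(2j * np.pi * (np.outer(L[i], u) + np.outer(M[i], v)))
    dirty[i] = np.real(phase @ vis) / n_vis

peak = np.unravel_index(np.argmax(dirty), dirty.shape)
print(L[peak], M[peak])                              # ~ position of the brightest source
```

    Deconvolution (e.g. CLEAN) then removes the point-spread function imprinted by the incomplete (u, v) coverage from this dirty image; the array-processing formulation instead estimates source parameters directly from the covariance data.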