93,534 research outputs found

    Online Active Linear Regression via Thresholding

    We consider the problem of online active learning to collect data for regression modeling. Specifically, we consider a decision maker with a limited experimentation budget who must efficiently learn an underlying linear population model. Our main contribution is a novel threshold-based algorithm for selecting the most informative observations; we characterize its performance and fundamental lower bounds. We extend the algorithm and its guarantees to sparse linear regression in high-dimensional settings. Simulations suggest the algorithm is remarkably robust: it provides significant benefits over passive random sampling in real-world datasets that exhibit high nonlinearity and high dimensionality, significantly reducing both the mean and variance of the squared error. Comment: Published in AAAI 201
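
    The core idea above, querying a label only when an arriving covariate vector looks informative, can be illustrated with a short sketch. This is a hypothetical illustration rather than the paper's algorithm: the covariate norm as the informativeness score, the hand-picked threshold, and the plain least-squares fit on the selected points are all assumptions.

```python
# Minimal sketch of threshold-based active selection for linear regression.
# Hypothetical illustration: the norm threshold, the budget, and the OLS fit
# are assumptions, not the published algorithm or its tuning.
import numpy as np

def active_threshold_ols(X_stream, y_stream, budget, threshold):
    """Keep an arriving point only if its covariate norm exceeds `threshold`;
    stop after `budget` labels have been used; fit ordinary least squares."""
    X_sel, y_sel = [], []
    for x, y in zip(X_stream, y_stream):
        if len(X_sel) >= budget:
            break
        if np.linalg.norm(x) >= threshold:   # crude informativeness score
            X_sel.append(x)
            y_sel.append(y)                  # the label is "queried" only here
    beta_hat, *_ = np.linalg.lstsq(np.asarray(X_sel), np.asarray(y_sel), rcond=None)
    return beta_hat

# Toy check: d = 5, known coefficients, a stream of 10,000 candidate points.
rng = np.random.default_rng(0)
beta_true = np.arange(1.0, 6.0)
X = rng.normal(size=(10_000, 5))
y = X @ beta_true + 0.1 * rng.normal(size=10_000)
print(np.round(active_threshold_ols(X, y, budget=200, threshold=2.5), 2))
```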

    Sparse Signal Recovery under Poisson Statistics

    We are motivated by problems that arise in a number of applications such as Online Marketing and explosives detection, where the observations are usually modeled using Poisson statistics. We model each observation as a Poisson random variable whose mean is a sparse linear superposition of known patterns. Unlike many conventional problems, the observations here are not identically distributed, since they are associated with different sensing modalities. We analyze the performance of a Maximum Likelihood (ML) decoder, which for our Poisson setting involves a non-linear optimization yet remains computationally tractable. We derive fundamental sample complexity bounds for sparse recovery when the measurements are contaminated with Poisson noise. In contrast to the least-squares linear regression setting with Gaussian noise, we observe that in addition to sparsity, the scale of the parameters also fundamentally impacts sample complexity. We introduce a novel notion of Restricted Likelihood Perturbation (RLP) to jointly account for scale and sparsity. We derive sample complexity bounds for $\ell_1$-regularized ML estimators in terms of RLP and further specialize these results for deterministic and random sensing matrix designs. Comment: 13 pages, 11 figures, 2 tables, submitted to IEEE Transactions on Signal Processing
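
    As a concrete illustration of $\ell_1$-regularized maximum likelihood under Poisson observations, the sketch below fits a nonnegative sparse rate model with a classical multiplicative EM (Richardson-Lucy-style) update. The nonnegative sensing matrix, penalty weight, and iteration count are illustrative assumptions; this is not the paper's RLP analysis or its decoder.

```python
# Hedged sketch: l1-penalized Poisson maximum likelihood for a nonnegative
# sparse rate model, solved with multiplicative EM (Richardson-Lucy-style)
# updates. Penalty weight, iteration count, and the nonnegativity restriction
# are simplifying assumptions for illustration only.
import numpy as np

def poisson_l1_ml(A, y, lam=0.5, n_iter=2000, eps=1e-12):
    """Maximize sum_i [y_i log(A theta)_i - (A theta)_i] - lam * sum_j theta_j
    over theta >= 0 using multiplicative EM updates."""
    col_sums = A.sum(axis=0)                                 # sum_i A_ij
    theta = np.full(A.shape[1], y.mean() / col_sums.mean())  # crude positive init
    for _ in range(n_iter):
        mu = A @ theta + eps                                 # current Poisson rates
        theta *= (A.T @ (y / mu)) / (col_sums + lam)         # penalized EM step
    return theta

# Toy check: 3-sparse nonnegative signal, nonnegative random patterns.
rng = np.random.default_rng(1)
A = rng.uniform(size=(200, 50))
theta_true = np.zeros(50); theta_true[:3] = [4.0, 2.0, 1.0]
y = rng.poisson(A @ theta_true).astype(float)
print(np.round(poisson_l1_ml(A, y), 2)[:6])
```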

    Inferring epigenetic and transcriptional regulation during blood cell development with a mixture of sparse linear models

    Motivation: Blood cell development is thought to be controlled by a circuit of transcription factors (TFs) and chromatin modifications that determine cell fate by activating cell type-specific expression programs. To shed light on the interplay between histone marks and TFs during blood cell development, we model gene expression from regulatory signals by means of combinations of sparse linear regression models. Results: The mixture of sparse linear regression models improved gene expression prediction relative to a single linear model. Moreover, it efficiently selected regulatory signals even when analyzing all TFs with known motifs (>600). The method identified interesting roles for histone modifications and a selection of TFs related to blood development and chromatin remodelling. Availability: The method and datasets are available from http://www.cin.ufpe.br/~igcf/SparseMix. Contact: [email protected] Supplementary information: Supplementary data are available at Bioinformatics online.
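
    A mixture of sparse linear regression models can be fit with a simple expectation-maximization loop that alternates weighted lasso fits with responsibility updates. The sketch below is a generic, hypothetical version of that idea, using scikit-learn's Lasso, two components, and a fixed noise variance; it is not the SparseMix implementation referenced above.

```python
# Hedged sketch of a mixture of sparse (lasso) regressions fit by EM.
# scikit-learn's Lasso, two components, and a fixed noise variance are
# illustrative assumptions; this is not the SparseMix software.
import numpy as np
from sklearn.linear_model import Lasso

def em_sparse_mixture(X, y, k=2, alpha=0.1, n_iter=20, sigma2=1.0, seed=0):
    """Fit k lasso regressions plus mixing weights by expectation-maximization."""
    rng = np.random.default_rng(seed)
    resp = rng.dirichlet(np.ones(k), size=X.shape[0])     # soft assignments
    models = [Lasso(alpha=alpha) for _ in range(k)]
    for _ in range(n_iter):
        # M-step: weighted sparse fits and mixing proportions.
        for j, m in enumerate(models):
            m.fit(X, y, sample_weight=resp[:, j])
        weights = resp.mean(axis=0)
        # E-step: responsibilities from Gaussian residual likelihoods.
        resid = np.column_stack([y - m.predict(X) for m in models])
        log_lik = -0.5 * resid**2 / sigma2 + np.log(weights)
        log_lik -= log_lik.max(axis=1, keepdims=True)     # stabilize exp
        resp = np.exp(log_lik)
        resp /= resp.sum(axis=1, keepdims=True)
    return models, weights

# Toy check: two latent regimes with different sparse coefficient vectors.
rng = np.random.default_rng(2)
X = rng.normal(size=(400, 30))
b1 = np.zeros(30); b1[:3] = 2.0
b2 = np.zeros(30); b2[5:8] = -2.0
z = rng.integers(0, 2, size=400)
y = np.where(z == 0, X @ b1, X @ b2) + 0.3 * rng.normal(size=400)
models, weights = em_sparse_mixture(X, y)
print(np.round(weights, 2))
```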

    Applications of Conjugate Gradient in Bayesian computation

    Conjugate gradient is an efficient algorithm for solving large sparse linear systems. It has been utilized to accelerate computation in Bayesian analysis for many large-scale problems. This article discusses the applications of conjugate gradient in Bayesian computation, with a focus on sparse regression and spatial analysis. A self-contained introduction to conjugate gradient is provided to facilitate potential applications in a broader range of problems. Comment: 7 pages. In Wiley StatsRef: Statistics Reference Online (2023). This paper was originally published on Wiley StatsRef: Statistics Reference Online on December 15, 2022. The reason for re-uploading it on arXiv is to enhance its visibility and accessibility.
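
    A typical use of conjugate gradient in Bayesian regression is to compute a Gaussian posterior mean by solving a large linear system with the posterior precision matrix, without ever factorizing it. The sketch below is a minimal, assumed example (Gaussian prior, fixed noise scale, matrix-free precision via SciPy's LinearOperator), not one of the article's own case studies.

```python
# Hedged sketch: conjugate gradient for the posterior mean of a Bayesian
# linear regression, applying the posterior precision matrix-free instead of
# factorizing it. The Gaussian prior, noise scale, and problem sizes are
# assumptions for illustration, not the article's examples.
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
n, p = 2000, 500
X = rng.normal(size=(n, p)) / np.sqrt(n)
beta = np.zeros(p); beta[:10] = 3.0
y = X @ beta + 0.1 * rng.normal(size=n)
sigma2, tau2 = 0.01, 1.0                      # noise and prior variances

# Posterior precision Q = X^T X / sigma2 + I / tau2; the posterior mean m
# solves Q m = X^T y / sigma2. Only matrix-vector products with Q are needed.
Q = LinearOperator((p, p), matvec=lambda v: X.T @ (X @ v) / sigma2 + v / tau2)
b = X.T @ y / sigma2
m, info = cg(Q, b)                            # info == 0 means CG converged
print(info, np.round(m[:12], 2))
```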