
    What is...a Curvelet?

    Energized by the success of wavelets, the last two decades saw the rapid development of a new field, computational harmonic analysis, which aims to develop new systems for effectively representing phenomena of scientific interest. The curvelet transform is a recent addition to the family of mathematical tools this community enthusiastically builds up. In short, it is a new multiscale transform with a strong directional character, whose elements are highly anisotropic at fine scales, with effective support obeying the parabolic scaling principle width ≈ length^2.
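    As a small numeric illustration (assumed by this rewrite, not taken from the abstract), parabolic scaling means that at dyadic scale 2^−j a curvelet's effective support has length about 2^(−j/2) and width about 2^(−j), so width ≈ length^2 and the elements become ever more needle-like at fine scales:

    ```python
    # Parabolic scaling sketch: at scale 2**-j (j even, so both sides are
    # exact powers of two in floating point), length ~ 2**(-j/2) and
    # width ~ 2**-j, hence width == length**2 and the aspect ratio
    # length/width = 2**(j/2) grows without bound as j increases.
    for j in range(2, 10, 2):
        length = 2.0 ** (-j / 2)
        width = 2.0 ** (-j)
        print(f"j={j}: length={length}, width={width}, "
              f"width == length**2: {width == length ** 2}")
    ```
    
    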

    Ridgelets and the representation of mutilated Sobolev functions

    We show that ridgelets, a system introduced in [E. J. Candès, Appl. Comput. Harmon. Anal., 6 (1999), pp. 197–218], are optimal for representing smooth multivariate functions that may exhibit linear singularities. For instance, let {u·x − b > 0} be an arbitrary half-space and consider the singular function f(x) = 1_{u·x−b>0} g(x), where g is compactly supported with finite Sobolev norm ||g||_{H^s}, s > 0. The ridgelet coefficient sequence of such an object is as sparse as if f had no singularity, allowing optimal partial reconstructions. For instance, the n-term approximation obtained by keeping the terms corresponding to the n largest coefficients in the ridgelet series achieves a rate of approximation of order n^(−s/d); the presence of the singularity does not spoil the quality of the ridgelet approximation. This is unlike all systems currently in use, especially Fourier or wavelet representations.

    Sparse signal and image recovery from Compressive Samples

    In this paper we present an introduction to Compressive Sampling (CS), an emerging model-based framework for data acquisition and signal recovery based on the premise that a signal having a sparse representation in one basis can be reconstructed from a small number of measurements collected in a second basis that is incoherent with the first. Interestingly, a random noise-like basis will suffice for the measurement process. We will overview the basic CS theory, discuss efficient methods for signal reconstruction, and highlight applications in medical imaging.
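    A minimal sketch of the setup this abstract describes (an illustration assumed by this rewrite, not code from the paper): a k-sparse signal is measured with a random noise-like Gaussian matrix, then reconstructed with orthogonal matching pursuit, one simple greedy recovery method:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Recover a k-sparse signal of length n from m < n random measurements.
    n, m, k = 128, 60, 4

    # Sparse signal in the canonical basis.
    x = np.zeros(n)
    support = rng.choice(n, size=k, replace=False)
    x[support] = rng.standard_normal(k)

    # Random Gaussian measurement matrix, incoherent with the canonical basis.
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    y = A @ x

    def omp(A, y, k):
        """Orthogonal matching pursuit: greedily pick the column most
        correlated with the residual, then re-fit by least squares."""
        residual, idx = y.copy(), []
        for _ in range(k):
            idx.append(int(np.argmax(np.abs(A.T @ residual))))
            coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
            residual = y - A[:, idx] @ coef
        x_hat = np.zeros(A.shape[1])
        x_hat[idx] = coef
        return x_hat

    x_hat = omp(A, y, k)
    print("reconstruction error:", np.linalg.norm(x_hat - x))
    ```

    With these dimensions the greedy decoder recovers the signal from less than half as many measurements as unknowns, which is the point of the CS premise.
    
    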

    Highly Robust Error Correction by Convex Programming

    This paper discusses a stylized communications problem where one wishes to transmit a real-valued signal x ∈ ℝ^n (a block of n pieces of information) to a remote receiver. We ask whether it is possible to transmit this information reliably when a fraction of the transmitted codeword is corrupted by arbitrary gross errors, and when, in addition, all the entries of the codeword are contaminated by smaller errors (e.g., quantization errors). We show that if one encodes the information as Ax, where A ∈ ℝ^(m×n) (m ≥ n) is a suitable coding matrix, there are two decoding schemes that allow the recovery of the block of n pieces of information x with nearly the same accuracy as if no gross errors occurred upon transmission (or, equivalently, as if one had an oracle supplying perfect information about the sites and amplitudes of the gross errors). Moreover, both decoding strategies are very concrete and only involve solving simple convex optimization programs, either a linear program or a second-order cone program. We complement our study with numerical simulations showing that the encoder/decoder pair performs remarkably well.
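    A hypothetical sketch of the linear-programming route (an illustration assumed by this rewrite, with made-up dimensions, not the paper's exact setup): encode x as Ax with a random Gaussian coding matrix, corrupt a few codeword entries with arbitrary gross errors, and decode by minimizing the l1 norm of the residual, which rewrites as a linear program:

    ```python
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(1)

    # Encode n pieces of information into an m-entry codeword, m > n.
    n, m, n_err = 20, 100, 10
    A = rng.standard_normal((m, n))
    x = rng.standard_normal(n)

    y = A @ x
    bad = rng.choice(m, size=n_err, replace=False)
    y[bad] += 10 * rng.standard_normal(n_err)   # arbitrary gross errors

    # Decode: minimize ||y - A x||_1.  Introduce slacks t >= |y - A x| and
    # solve the LP  min 1'@t  s.t.  A x - t <= y  and  -A x - t <= -y.
    c = np.concatenate([np.zeros(n), np.ones(m)])
    A_ub = np.block([[A, -np.eye(m)], [-A, -np.eye(m)]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * n + [(0, None)] * m
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    x_hat = res.x[:n]
    print("decoding error:", np.linalg.norm(x_hat - x))
    ```

    When the fraction of grossly corrupted entries is small enough relative to the redundancy m − n, the l1 decoder recovers x essentially exactly despite having no idea which entries were corrupted.
    
    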

    An Introduction To Compressive Sampling [A sensing/sampling paradigm that goes against the common knowledge in data acquisition]

    This article surveys the theory of compressive sampling, also known as compressed sensing or CS, a novel sensing/sampling paradigm that goes against the common wisdom in data acquisition. CS theory asserts that one can recover certain signals and images from far fewer samples or measurements than traditional methods use. To make this possible, CS relies on two principles: sparsity, which pertains to the signals of interest, and incoherence, which pertains to the sensing modality. Our intent in this article is to overview the basic CS theory that emerged in the works [1]–[3], present the key mathematical ideas underlying this theory, and survey a couple of important results in the field. Our goal is to explain CS as plainly as possible, and so our article is mainly of a tutorial nature. One of the charms of this theory is that it draws from various subdisciplines within the applied mathematical sciences, most notably probability theory. In this review, we have decided to highlight this aspect and especially the fact that randomness can — perhaps surprisingly — lead to very effective sensing mechanisms. We will also discuss significant implications, explain why CS is a concrete protocol for sensing and compressing data simultaneously (thus the name), and conclude our tour by reviewing important applications.

    Gene Hunting with Knockoffs for Hidden Markov Models

    Modern scientific studies often require the identification of a subset of relevant explanatory variables, in the attempt to understand an interesting phenomenon. Several statistical methods have been developed to automate this task, but only recently has the framework of model-free knockoffs proposed a general solution that can perform variable selection under rigorous type-I error control, without relying on strong modeling assumptions. In this paper, we extend the methodology of model-free knockoffs to a rich family of problems where the distribution of the covariates can be described by a hidden Markov model (HMM). We develop an exact and efficient algorithm to sample knockoff copies of an HMM. We then argue that, combined with the knockoffs selective framework, they provide a natural and powerful tool for performing principled inference in genome-wide association studies with guaranteed FDR control. Finally, we apply our methodology to several datasets aimed at studying Crohn's disease and several continuous phenotypes, e.g. levels of cholesterol. Comment: 35 pages, 13 figures, 9 tables.