
    Splines Are Universal Solutions of Linear Inverse Problems with Generalized TV Regularization

    Splines come in a variety of flavors that can be characterized in terms of some differential operator L. The simplest piecewise-constant model corresponds to the derivative operator. Likewise, one can extend the traditional notion of total variation by considering more general operators than the derivative. This results in the definitions of a generalized total variation seminorm and its corresponding native space, which is further identified as the direct sum of two Banach spaces. We then prove that the minimization of the generalized total variation (gTV), subject to some arbitrary (convex) consistency constraints on the linear measurements of the signal, admits nonuniform L-spline solutions with fewer knots than the number of measurements. This shows that nonuniform splines are universal solutions of continuous-domain linear inverse problems with LASSO, L1, or total-variation-like regularization constraints. Remarkably, the type of spline is fully determined by the choice of L and does not depend on the actual nature of the measurements.
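As a hedged illustration of this result (a discrete toy, not the paper's continuous-domain construction; all sizes, the sensing matrix, and the regularization weight are assumptions): with L the derivative, writing the signal as a cumulative sum of innovations turns the gTV penalty into an L1 norm, so the measurement-consistent reconstruction becomes a LASSO whose sparse innovations are the spline knots.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, lam = 120, 10, 0.1   # grid size, measurements, regularization (illustrative)

# Piecewise-constant ground truth: cumulative sum of three "knots"
u_true = np.zeros(n)
u_true[[20, 60, 95]] = [1.0, -2.0, 1.5]
x_true = np.cumsum(u_true)

# Hypothetical random linear measurements of the signal
A = rng.standard_normal((m, n)) / np.sqrt(n)
y = A @ x_true

# Reparametrize x = C u (C = cumulative-sum matrix), so that TV(x) = ||u||_1
# and the constrained problem becomes a LASSO in the innovations u.
C = np.tril(np.ones((n, n)))
B = A @ C

# ISTA (proximal gradient) for  min_u  0.5 ||y - B u||^2 + lam ||u||_1
L = np.linalg.norm(B, 2) ** 2      # Lipschitz constant of the smooth part
u = np.zeros(n)
for _ in range(5000):
    z = u - B.T @ (B @ u - y) / L
    u = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold

knots = np.count_nonzero(np.abs(u) > 1e-3 * np.abs(u).max())
print("active knots:", knots, "| measurements:", m)
```

The sparsity of the recovered innovations mirrors the theorem's knot count; the type of spline (here piecewise-constant) is set by the choice of L, not by the rows of A.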

    Interpretation of Continuous-Time Autoregressive Processes as Random Exponential Splines

    We consider the class of continuous-time autoregressive (CAR) processes driven by (possibly non-Gaussian) Lévy white noises. When the excitation is an impulsive noise, also known as compound Poisson noise, the associated CAR process is a random non-uniform exponential spline. Therefore, Poisson-type processes are relatively easy to understand in the sense that they have a finite rate of innovation. We show in this paper that any CAR process is the limit in distribution of a sequence of CAR processes driven by impulsive noises. Hence, we provide a new interpretation of general CAR processes as limits of random exponential splines. We illustrate our result with simulations.
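A minimal simulation sketch (parameter values are illustrative assumptions, not taken from the paper): a CAR(1) process x'(t) + a x(t) = w(t) driven by compound Poisson noise is, by superposition, a nonuniform exponential spline with knots at the impulse locations.

```python
import numpy as np

rng = np.random.default_rng(1)

a = 2.0                    # illustrative CAR(1) pole: x'(t) + a x(t) = w(t)
T, rate = 10.0, 3.0        # time horizon and Poisson rate of the impulses

# Compound Poisson excitation: impulse times t_k and i.i.d. amplitudes c_k
n_imp = rng.poisson(rate * T)
t_k = np.sort(rng.uniform(0.0, T, n_imp))
c_k = rng.standard_normal(n_imp)

# Each impulse launches a causal decaying exponential; the CAR(1) path is
# their superposition, i.e. a nonuniform exponential spline with knots t_k.
t = np.linspace(0.0, T, 2000)
x = np.zeros_like(t)
for tk, ck in zip(t_k, c_k):
    x += ck * np.exp(-a * (t - tk)) * (t >= tk)
```

Between consecutive knots the trajectory is a pure exponential decay; driving the same equation with a general Lévy noise instead of impulses would produce the limiting processes discussed in the abstract.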

    Compressibility of Symmetric-α-Stable Processes

    Within a deterministic framework, it is well known that n-term wavelet approximation rates of functions can be deduced from their Besov regularity. We use this principle to determine approximation rates for symmetric-α-stable (SαS) stochastic processes. First, we characterize the Besov regularity of SαS processes. Then the n-term approximation rates follow. To capture the local smoothness behavior, we consider sparse processes defined on the circle that are solutions of stochastic differential equations.
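The n-term idea can be probed numerically (an illustrative experiment, not the paper's construction): sample SαS increments with the Chambers-Mallows-Stuck formula, take a discrete Haar transform of the cumulative path, and keep only the n largest coefficients; heavier tails (smaller α) concentrate the energy in a few large jumps.

```python
import numpy as np

rng = np.random.default_rng(2)

def sas(alpha, size):
    """Chambers-Mallows-Stuck sampler for symmetric-alpha-stable noise."""
    V = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * V) / np.cos(V) ** (1 / alpha)
            * (np.cos(V - alpha * V) / W) ** ((1 - alpha) / alpha))

def haar(x):
    """Orthonormal Haar wavelet transform (length must be a power of two)."""
    coeffs, s = [], x.copy()
    while len(s) > 1:
        coeffs.append((s[0::2] - s[1::2]) / np.sqrt(2))   # details
        s = (s[0::2] + s[1::2]) / np.sqrt(2)              # approximation
    coeffs.append(s)
    return np.concatenate(coeffs[::-1])

n, keep = 1024, 64
# Heavy-tailed Levy flight (alpha = 1.2) vs. Gaussian-increment path
# (alpha = 2 reduces the sampler to a Gaussian law).
err = {}
for alpha in (1.2, 2.0):
    c = haar(np.cumsum(sas(alpha, n)))
    thr = np.sort(np.abs(c))[-keep]           # keep the `keep` largest coeffs
    c_approx = np.where(np.abs(c) >= thr, c, 0.0)
    err[alpha] = np.sum((c - c_approx) ** 2) / np.sum(c ** 2)

print(err)   # relative n-term approximation error per alpha
```

Because the Haar transform is orthonormal, the coefficient-domain error equals the signal-domain approximation error.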

    Statistics of Wavelet Coefficients for Sparse Self-Similar Images

    We study the statistics of wavelet coefficients of non-Gaussian images, focusing mainly on the behaviour at coarse scales. We assume that an image can be whitened by a fractional Laplacian operator, which is consistent with a ‖ω‖^{-γ} spectral decay. In other words, we model images as sparse and self-similar stochastic processes within the framework of generalised innovation models. We show that the wavelet coefficients at coarse scales are asymptotically Gaussian even if the prior model for fine scales is sparse. We further refine our analysis by deriving the theoretical evolution of the cumulants of wavelet coefficients across scales. In particular, the evolution of the kurtosis supplies a theoretical prediction for the Gaussianity level at each scale. Finally, we provide simulations and experiments that support our theoretical predictions.
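The Gaussianization across scales can be checked with a one-dimensional toy experiment (a sketch with assumed parameters, not the paper's image model): a sparse Bernoulli-Gaussian innovation is run through a Haar pyramid, and the excess kurtosis of the detail coefficients shrinks as the scale coarsens, since each coarse coefficient aggregates many independent innovations.

```python
import numpy as np

rng = np.random.default_rng(3)

def excess_kurtosis(x):
    x = x - x.mean()
    return np.mean(x ** 4) / np.mean(x ** 2) ** 2 - 3.0

# Sparse (Bernoulli-Gaussian) innovation: highly non-Gaussian at fine scales
n = 2 ** 16
mask = rng.random(n) < 0.05
w = np.where(mask, rng.standard_normal(n) * 5.0, 0.0)

# Haar detail coefficients at successively coarser scales: each step averages
# pairs, so coarse-scale coefficients aggregate ever more innovations and, by
# the central limit effect, become progressively more Gaussian.
s, kurt = w.copy(), []
for _ in range(8):
    d = (s[0::2] - s[1::2]) / np.sqrt(2)
    kurt.append(excess_kurtosis(d))
    s = (s[0::2] + s[1::2]) / np.sqrt(2)

print([round(k, 2) for k in kurt])   # kurtosis evolution, fine to coarse
```

The decay of the kurtosis sequence is the scalar analogue of the cumulant evolution the abstract describes.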

    Functional estimation of anisotropic covariance and autocovariance operators on the sphere

    We propose nonparametric estimators for the second-order central moments of possibly anisotropic spherical random fields, within a functional data analysis context. We consider a measurement framework where each random field among an identically distributed collection of spherical random fields is sampled at a few random directions, possibly subject to measurement error. The collection of random fields could be i.i.d. or serially dependent. Though similar setups have already been explored for random functions defined on the unit interval, the nonparametric estimators proposed in the literature often rely on local polynomials, which do not readily extend to the (product) spherical setting. We therefore formulate our estimation procedure as a variational problem involving a generalized Tikhonov regularization term. The latter favours smooth covariance/autocovariance functions, where the smoothness is specified by means of suitable Sobolev-like pseudo-differential operators. Using the machinery of reproducing kernel Hilbert spaces, we establish representer theorems that fully characterize the form of our estimators. We determine their uniform rates of convergence as the number of random fields diverges, both for the dense (increasing number of spatial samples) and sparse (bounded number of spatial samples) regimes. We moreover demonstrate the computational feasibility and practical merits of our estimation procedure in a simulation setting, assuming a fixed number of samples per random field. Our numerical estimation procedure leverages the sparsity and second-order Kronecker structure of our setup to reduce the computational and memory requirements by approximately three orders of magnitude compared to what a naive implementation would require.
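As a loose one-dimensional analogue (everything below is an assumed toy setup on the unit interval, with a Gaussian kernel standing in for the Sobolev-type pseudo-differential penalty): sparsely sampled i.i.d. random curves yield raw cross-products, and a representer-theorem-style kernel ridge regression on the product domain returns a smooth covariance estimate.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy i.i.d. random curves on [0, 1] in a small cosine basis, observed at a
# few random locations per curve with additive measurement error.
def sample_curve(ts):
    xi = rng.standard_normal(3) * np.array([1.0, 0.5, 0.25])
    phi = np.stack([np.ones_like(ts),
                    np.sqrt(2.0) * np.cos(np.pi * ts),
                    np.sqrt(2.0) * np.cos(2.0 * np.pi * ts)])
    return xi @ phi

N, r, sigma = 80, 4, 0.1
S, Y = [], []                     # pair locations (s, t) and raw products
for _ in range(N):
    ts = rng.uniform(0.0, 1.0, r)
    x = sample_curve(ts) + sigma * rng.standard_normal(r)
    for i in range(r):
        for j in range(r):
            if i != j:            # off-diagonal pairs avoid the noise bias
                S.append((ts[i], ts[j]))
                Y.append(x[i] * x[j])
S, Y = np.array(S), np.array(Y)

# Kernel ridge regression on the product domain; the representer theorem
# reduces the variational problem to one linear system in the coefficients.
def kern(P, Q, h=0.2):
    d2 = ((P[:, None, :] - Q[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * h * h))

lam = 1e-3
alpha = np.linalg.solve(kern(S, S) + lam * len(Y) * np.eye(len(Y)), Y)

def cov_hat(s, t):
    """Smoothed covariance estimate at a single point (s, t)."""
    return (kern(np.array([[s, t]]), S) @ alpha)[0]
```

Because every pair (s, t) enters the data together with its swap (t, s), the fitted surface is symmetric, as a covariance must be.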

    MAP Estimators for Self-Similar Sparse Stochastic Models

    We consider the reconstruction of multi-dimensional signals from noisy samples. The problem is formulated within the framework of the theory of continuous-domain sparse stochastic processes. In particular, we study the fractional Laplacian as the whitening operator specifying the correlation structure of the model. We then derive a class of MAP estimators where the priors are confined to the family of infinitely divisible distributions. Finally, we provide simulations where the derived estimators are compared against total-variation (TV) denoising.
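A sketch of the comparison in one dimension (illustrative parameters; smoothed gradient descent stands in for the paper's estimators): MAP denoising with a Laplace prior on the increments coincides with TV denoising, while a heavier-tailed Student-t prior, also infinitely divisible, yields a log penalty on the increments.

```python
import numpy as np

rng = np.random.default_rng(5)

# Piecewise-constant test signal plus Gaussian noise
n = 256
x_true = np.concatenate([np.zeros(n // 4), np.ones(n // 4),
                         -np.ones(n // 4), np.zeros(n // 4)])
y = x_true + 0.2 * rng.standard_normal(n)

def denoise(y, phi_grad, lam, step=0.05, iters=4000):
    """Gradient descent on 0.5 ||x - y||^2 + lam * sum phi(x_{k+1} - x_k)."""
    x = y.copy()
    for _ in range(iters):
        g = phi_grad(np.diff(x))
        grad = x - y
        grad[:-1] -= lam * g
        grad[1:] += lam * g
        x = x - step * grad
    return x

eps = 1e-2
tv_grad = lambda d: d / np.sqrt(d * d + eps)   # smoothed |d|: Laplace / TV prior
stu_grad = lambda d: 2 * d / (d * d + 0.1)     # log(1 + d^2/0.1): Student-t prior

x_tv = denoise(y, tv_grad, lam=0.5)
x_stu = denoise(y, stu_grad, lam=0.05)
print("TV MSE:", np.mean((x_tv - x_true) ** 2))
```

The two estimators differ only in the penalty on the increments, which is exactly how the choice of infinitely divisible prior enters the MAP functional.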

    Local rotation invariance in 3D CNNs.

    Locally Rotation Invariant (LRI) image analysis was shown to be fundamental in many applications, in particular in medical imaging, where local structures of tissues occur at arbitrary rotations. LRI constituted the cornerstone of several breakthroughs in texture analysis, including Local Binary Patterns (LBP), Maximum Response 8 (MR8) and steerable filterbanks. Whereas globally rotation invariant Convolutional Neural Networks (CNN) were recently proposed, LRI has been little investigated in the context of deep learning. LRI designs allow learning filters accounting for all orientations, which enables a drastic reduction of trainable parameters and training data when compared to standard 3D CNNs. In this paper, we propose and compare several methods to obtain LRI CNNs with directional sensitivity. Two methods use orientation channels (responses to rotated kernels), either by explicitly rotating the kernels or by using steerable filters. These orientation channels constitute a locally rotation equivariant representation of the data. Local pooling across orientations yields LRI image analysis. Steerable filters are used to achieve a fine and efficient sampling of 3D rotations, as well as a reduction of trainable parameters and operations, thanks to a parametric representation involving solid Spherical Harmonics (SH), which are products of SH with associated learned radial profiles. Finally, we investigate a third strategy to obtain LRI based on rotational invariants calculated from responses to a learned set of solid SHs. The proposed methods are evaluated and compared to standard CNNs on 3D datasets, including synthetic textured volumes composed of rotated patterns and pulmonary nodule classification in CT. The results show the importance of LRI image analysis: the proposed designs drastically reduce the number of trainable parameters while outperforming standard 3D CNNs trained with rotational data augmentation.
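The orientation-channel idea can be sketched in a few lines of numpy (a 2D toy restricted to right-angle rotations; the paper's steerable-filter parameterization samples 3D rotations far more finely): correlate with rotated copies of one kernel and max-pool across the orientation channels. The channels form a rotation equivariant representation, and the pooled map rotates with the input while its values are preserved.

```python
import numpy as np

rng = np.random.default_rng(6)

def correlate2d_valid(img, k):
    """Plain 'valid' cross-correlation (no kernel flip), stride 1."""
    H, W = img.shape
    h, w = k.shape
    out = np.empty((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + h, j:j + w] * k)
    return out

def lri_response(img, k):
    """Orientation channels from rotated copies of one kernel, followed by
    an orientation max-pool: locally rotation invariant (here only to
    90-degree rotations; steerable filters refine the angular sampling)."""
    channels = [correlate2d_valid(img, np.rot90(k, r)) for r in range(4)]
    return np.max(channels, axis=0)

img = rng.standard_normal((16, 16))
k = rng.standard_normal((3, 3))     # stand-in for a single learned filter

r0 = lri_response(img, k)
r1 = lri_response(np.rot90(img), k)
# Pooled values are preserved under rotation of the input:
print(np.allclose(np.sort(r0.ravel()), np.sort(r1.ravel())))
```

Because the four rotated kernels form a closed set under 90-degree rotation, rotating the input merely permutes the orientation channels, so the max-pooled response map is exactly the rotated version of the original one.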