Regularization plays a pivotal role when facing the challenge of solving ill-posed inverse problems, where the number of observations is smaller than the ambient dimension of the object to be estimated. A line of recent work has studied regularization models with various types of low-dimensional structures. In such settings, the general approach is to solve a regularized optimization problem which combines a data fidelity term with a regularization penalty that promotes the assumed low-dimensional/simple structure. This paper provides a general framework to capture this low-dimensional structure through what we coin piecewise regular gauges. These are convex, non-negative, closed, bounded and positively homogeneous functions that promote objects living on low-dimensional subspaces. This class of regularizers encompasses many popular examples such as the L^1 norm, the L^1-L^2 norm (group sparsity), and the nuclear norm, as well as several others including the L^inf norm. We show that the set of piecewise regular gauges is closed under addition and pre-composition by a linear operator, which allows us to cover mixed regularization (e.g. sparse+low-rank) and so-called analysis-type priors (e.g. total variation, fused Lasso, trace Lasso, bounded polyhedral gauges). Our main result presents a unified sharp analysis of exact and robust recovery of the low-dimensional subspace model associated with the object to be recovered from partial measurements. This analysis is illustrated on a number of special and previously studied cases.
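To make the general approach concrete, the following LaTeX sketch writes out one common instance of such a regularized problem. The squared-loss fidelity and the symbols Phi, y, lambda are illustrative assumptions (the abstract only specifies "a data fidelity term and some regularization penalty"); J stands for a piecewise regular gauge as described above.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Sketch of the generic regularized formulation described in the abstract.
% Phi: measurement operator, y: partial observations, lambda > 0: trade-off
% parameter, J: a gauge (convex, non-negative, closed, bounded, positively
% homogeneous) promoting the assumed low-dimensional structure.
\[
  \min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\,\lVert y - \Phi x \rVert_2^2 \;+\; \lambda\, J(x)
\]
Typical choices of the gauge include $J = \lVert\cdot\rVert_1$ (sparsity),
$J = \lVert\cdot\rVert_{1,2}$ (group sparsity), and $J = \lVert\cdot\rVert_*$
(nuclear norm, low rank).
\end{document}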