A Tight Version of the Gaussian min-max theorem in the Presence of Convexity
Gaussian comparison theorems are useful tools in probability theory; they are
essential ingredients in the classical proofs of many results in empirical
processes and extreme value theory. More recently, they have been used
extensively in the analysis of underdetermined linear inverse problems. A
prominent role in the study of those problems is played by Gordon's Gaussian
min-max theorem. It has been observed that the use of the Gaussian min-max
theorem produces results that are often tight. Motivated by recent work due to
M. Stojnic, we argue explicitly that the theorem is tight under additional
convexity assumptions. To illustrate the usefulness of the result, we provide an
application example from the field of noisy linear inverse problems.
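For orientation, here is a sketch of the pair of optimizations the theorem compares, written in the notation commonly used for this result; the paper's own symbols and constants may differ.

```latex
% Sketch only; notation follows common usage for this result.
% G is m x n with i.i.d. N(0,1) entries, g ~ N(0, I_m), h ~ N(0, I_n),
% and psi is a fixed continuous function on S_w x S_u.
\Phi(G)    = \min_{w \in S_w} \max_{u \in S_u} \; u^\top G w + \psi(w, u),
\qquad
\phi(g, h) = \min_{w \in S_w} \max_{u \in S_u} \; \|w\|_2\, g^\top u
             + \|u\|_2\, h^\top w + \psi(w, u).
% Gordon's theorem gives the one-sided comparison
%   P(\Phi(G) < c) \le 2\, P(\phi(g, h) \le c).
% When S_w and S_u are convex and compact and \psi is convex-concave, the
% reverse tail bound P(\Phi(G) > c) \le 2\, P(\phi(g, h) \ge c) also holds,
% so the two optimal values concentrate around the same constant.
```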
Isotropically Random Orthogonal Matrices: Performance of LASSO and Minimum Conic Singular Values
Recently, the precise performance of the Generalized LASSO algorithm for
recovering structured signals from compressed noisy measurements, obtained via
i.i.d. Gaussian matrices, has been characterized. The analysis is based on a
framework introduced by Stojnic and heavily relies on the use of Gordon's
Gaussian min-max theorem (GMT), a comparison principle on Gaussian processes.
As a result, corresponding characterizations for other ensembles of measurement
matrices have not been developed. In this work, we analyze the corresponding
performance of the ensemble of isotropically random orthogonal (i.r.o.)
measurements. We consider the constrained version of the Generalized LASSO and
derive a sharp characterization of its normalized squared error in the
large-system limit. When compared to its Gaussian counterpart, our result
analytically confirms the superior performance of the i.r.o. ensemble.
Our second result derives an asymptotic lower bound on the minimum conic
singular values of i.r.o. matrices. This bound is larger than the corresponding
bound on Gaussian matrices. To prove our results we express i.r.o. matrices in
terms of Gaussians and show that, with some modifications, the GMT framework is
still applicable.
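A minimal numerical sketch of the comparison (not the paper's experiment): the i.r.o. ensemble can be sampled via a sign-corrected QR decomposition of a Gaussian matrix, and the LASSO's normalized squared error measured under each ensemble. The scalings, the regularization level `lam`, and the plain ISTA solver below are illustrative assumptions, not taken from the paper, which analyzes the constrained Generalized LASSO.

```python
import numpy as np

def haar_rows(m, n, rng):
    """m x n matrix with orthonormal rows, Haar-distributed (the i.r.o. ensemble)."""
    G = rng.standard_normal((n, m))
    Q, R = np.linalg.qr(G)          # Q: n x m with orthonormal columns
    Q = Q * np.sign(np.diag(R))     # sign correction makes the law exactly Haar
    return Q.T

def ista_lasso(A, y, lam, iters=1000):
    """Plain ISTA for min_x 0.5*||A x - y||_2^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - step * (A.T @ (A @ x - y))
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return x

rng = np.random.default_rng(0)
n, m, k, sigma = 400, 200, 20, 0.1
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

ensembles = {
    "gaussian": rng.standard_normal((m, n)) / np.sqrt(n),  # rows of unit norm in expectation
    "i.r.o.":   haar_rows(m, n, rng),                      # rows of unit norm exactly
}
for name, A in ensembles.items():
    y = A @ x0 + sigma * rng.standard_normal(m)
    xhat = ista_lasso(A, y, lam=0.05)
    print(f"{name:8s} NSE = {np.sum((xhat - x0) ** 2) / sigma ** 2:.3f}")
```

A lower NSE for the i.r.o. runs is consistent with the abstract's claim, though a faithful check would use the constrained formulation and the paper's scaling conventions.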
Differentially Private Empirical Risk Minimization with Sparsity-Inducing Norms
Differentially private learning aims to preserve prediction quality while
limiting the privacy impact on individuals whose information is contained in
the data. We consider differentially private risk minimization problems with
regularizers that induce structured sparsity. These regularizers are known to
be convex but they are often non-differentiable. We analyze the standard
differentially private algorithms, such as output perturbation, Frank-Wolfe and
objective perturbation. Output perturbation is a differentially private
algorithm that is known to perform well for minimizing risks that are strongly
convex. Previous works have derived excess risk bounds that are independent of
the dimensionality. In this paper, we assume a particular class of convex but
non-smooth regularizers that induce structured sparsity and loss functions for
generalized linear models. We also consider differentially private Frank-Wolfe
algorithms to optimize the dual of the risk minimization problem. We derive
excess risk bounds for both of these algorithms; both bounds depend on the
Gaussian width of the unit ball of the dual norm. We also show that objective
perturbation of the risk minimization problems is equivalent to the output
perturbation of a dual optimization problem. This is the first work to
analyze the dual optimization problems of risk minimization in the context of
differential privacy.
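As a rough illustration of two ingredients above, under assumptions not taken from the paper: output perturbation via the standard Gaussian mechanism, using the well-known L2-sensitivity bound 2L/(n*lam) for an L-Lipschitz loss and a lam-strongly convex objective, followed by a Monte Carlo estimate of the Gaussian width of a dual-norm unit ball.

```python
import numpy as np

rng = np.random.default_rng(0)

def output_perturbation(theta_hat, L, n, lam, eps, delta):
    """Release the ERM minimizer with Gaussian noise (the Gaussian mechanism).
    Assumes an L-Lipschitz loss over n samples and a lam-strongly convex
    objective, so the L2 sensitivity of the minimizer is at most 2*L/(n*lam);
    the paper's exact constants may differ."""
    sensitivity = 2.0 * L / (n * lam)
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps  # (eps, delta)-DP
    return theta_hat + rng.normal(0.0, sigma, size=theta_hat.shape)

# Monte Carlo estimate of a Gaussian width, w(K) = E sup_{x in K} <g, x>.
# Example: for K the unit l_inf ball (the dual ball of the l1 norm), the
# supremum is ||g||_1, so w(K) = E||g||_1 = d*sqrt(2/pi).
d, trials = 100, 2000
g = rng.standard_normal((trials, d))
print(np.abs(g).sum(axis=1).mean(), d * np.sqrt(2.0 / np.pi))  # should roughly agree
```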