A Tight Version of the Gaussian min-max theorem in the Presence of Convexity
Gaussian comparison theorems are useful tools in probability theory; they are
essential ingredients in the classical proofs of many results in empirical
processes and extreme value theory. More recently, they have been used
extensively in the analysis of underdetermined linear inverse problems. A
prominent role in the study of those problems is played by Gordon's Gaussian
min-max theorem. It has been observed that the use of the Gaussian min-max
theorem produces results that are often tight. Motivated by recent work due to
M. Stojnic, we argue explicitly that the theorem is tight under additional
convexity assumptions. To illustrate the usefulness of the result, we provide
an application example from the field of noisy linear inverse problems.
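For context, Gordon's Gaussian min-max theorem and its tight convex version can be sketched as follows. This is a standard formulation, not quoted from the abstract; the notation ($\Phi$, $\phi$, $\psi$, $S_x$, $S_y$) is our own.

```latex
% Let G be an m x n matrix with i.i.d. N(0,1) entries, and let g, h be
% independent standard Gaussian vectors in R^m and R^n. For compact sets
% S_x \subset R^n, S_y \subset R^m and a continuous function \psi, define
% the primary and auxiliary optimizations
\begin{align}
\Phi(G)   &= \min_{x \in S_x} \max_{y \in S_y} \; y^\top G x + \psi(x, y), \\
\phi(g,h) &= \min_{x \in S_x} \max_{y \in S_y} \;
             \|x\|_2\, g^\top y + \|y\|_2\, h^\top x + \psi(x, y).
\end{align}
% Gordon's theorem gives the one-sided comparison
\begin{equation}
\mathbb{P}\big(\Phi(G) < c\big) \le 2\, \mathbb{P}\big(\phi(g,h) \le c\big),
\end{equation}
% and, when S_x and S_y are additionally convex and \psi is convex in x and
% concave in y, the tight (convex) version supplies the matching bound
\begin{equation}
\mathbb{P}\big(\Phi(G) > c\big) \le 2\, \mathbb{P}\big(\phi(g,h) \ge c\big),
\end{equation}
% so the auxiliary problem \phi sharply characterizes \Phi in probability.
```

In the noisy linear inverse problem application, the minimization is typically over the estimation error of a regularized estimator, and the auxiliary problem reduces to a low-dimensional deterministic optimization.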
Bilinearly indexed random processes -- \emph{stationarization} of fully lifted interpolation
Our companion paper \cite{Stojnicnflgscompyx23} introduced a very powerful
\emph{fully lifted} (fl) statistical interpolating/comparison mechanism for
bilinearly indexed random processes. Here, we present a particular realization
of this fl mechanism that relies on the concept of stationarization along the
interpolating path. A collection of fundamental relations among the
interpolating parameters is uncovered, contextualized, and presented. As a
bonus, we show that in particular special cases the introduced machinery
admits various simplifications to forms readily usable in practice. Given how
many well-known random structures and optimization problems critically rely on
results of this type, the range of applications is practically unlimited. We
briefly point to some of these opportunities as well.
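A minimal sketch of the central object, under our own notational assumptions (the abstract itself does not fix notation): a bilinearly indexed Gaussian process is one indexed jointly by two vectors entering bilinearly through a Gaussian matrix.

```latex
% For index sets X \subset R^n and Y \subset R^m, and a matrix G with
% i.i.d. N(0,1) entries, the canonical bilinearly indexed process is
\begin{equation}
Z(x, y) \;=\; y^\top G x \;=\; \sum_{i=1}^{m} \sum_{j=1}^{n} y_i\, G_{ij}\, x_j,
\qquad (x, y) \in X \times Y.
\end{equation}
% Extremal statistics of such processes, e.g.
% \mathbb{E} \max_{x \in X} \max_{y \in Y} Z(x, y),
% are the quantities that interpolating/comparison mechanisms of the type
% discussed here are designed to evaluate or bound.
```

Many random optimization problems (e.g. those involving random linear constraints) reduce to extremal statistics of exactly this bilinear form, which is why results of this type have such a wide range of applications.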