Noise stability is computable and approximately low-dimensional
Questions of noise stability play an important role in hardness of approximation in computer science as well as in the theory of voting. In many applications, the goal is to find an optimizer of noise stability among all possible partitions of R^n, n ≥ 1, into k parts with given Gaussian measures μ_1, …, μ_k. We call a partition ε-optimal if its noise stability is optimal up to an additive ε. In this paper, we give an explicit, computable function n(ε) such that an ε-optimal partition exists in R^{n(ε)}. This result has implications for the computability of certain problems in non-interactive simulation, which are addressed in a subsequent work.
Keywords: Gaussian noise stability; Plurality is stablest; Ornstein-Uhlenbeck operator
National Science Foundation (U.S.) (Award CCF-1320105); United States. Office of Naval Research (Grant N00014-16-1-2227)
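For context, the standard definitions behind the terms in this abstract (not reproduced in the abstract itself) are the Ornstein-Uhlenbeck operator T_ρ and the Gaussian noise stability of a set:

```latex
% Ornstein--Uhlenbeck (noise) operator on functions f : R^n -> R:
\[
  (T_\rho f)(x) = \mathbb{E}_{Z \sim \mathcal{N}(0, I_n)}
    \left[ f\!\left(\rho x + \sqrt{1-\rho^2}\, Z\right) \right].
\]
% Noise stability of a set A \subseteq R^n under the Gaussian measure \gamma_n:
\[
  \mathbb{S}_\rho(A) = \int_{\mathbb{R}^n} \mathbf{1}_A \,(T_\rho \mathbf{1}_A)\, d\gamma_n
    = \Pr\left[\, X \in A,\ Y \in A \,\right],
\]
% where (X, Y) is a pair of rho-correlated standard Gaussian vectors.
```

"Plurality is stablest" refers to the conjecture that, among low-influence partitions into k parts of equal measure, the plurality partition maximizes this quantity.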
A Survey on Continuous Time Computations
We provide an overview of theories of continuous time computation. These
theories allow us to understand both the hardness of questions related to
continuous time dynamical systems and the computational power of continuous
time analog models. We survey existing models, summarize results, and point to
relevant references in the literature.
How Quantum Computers Fail: Quantum Codes, Correlations in Physical Systems, and Noise Accumulation
The feasibility of computationally superior quantum computers is one of the
most exciting and clear-cut scientific questions of our time. The question
touches on fundamental issues regarding probability, physics, and
computability, as well as on exciting problems in experimental physics,
engineering, computer science, and mathematics. We propose three related
directions towards a negative answer. The first is a conjecture about physical
realizations of quantum codes, the second has to do with correlations in
stochastic physical systems, and the third proposes a model for quantum
evolutions when noise accumulates. The paper is dedicated to the memory of
Itamar Pitowsky.
Comment: 16 pages
Non interactive simulation of correlated distributions is decidable
A basic problem in information theory is the following: Let P = (X, Y) be an arbitrary distribution where the marginals X and Y are (potentially) correlated. Let Alice and Bob be two players where Alice gets samples {x_i} and Bob gets samples {y_i}, and for all i, (x_i, y_i) ~ P. What joint distributions Q can be simulated by Alice and Bob without any interaction?
Classical works in information theory by Gács-Körner and Wyner answer this
question when at least one of P or Q is the distribution on {0,1} × {0,1}
where each marginal is unbiased and identical. However, other than this
special case, the answer to this question is understood in very few cases.
Recently, Ghazi, Kamath and Sudan showed that this problem is decidable for Q
supported on {0,1} × {0,1}. We extend their result to Q supported on any
finite alphabet.
We rely on recent results in Gaussian geometry (by the authors) as well as a
new smoothing argument inspired by the method of boosting from learning
theory and potential function arguments from complexity theory and additive
combinatorics.
Comment: The reduction for non-interactive simulation for general source
distributions to the Gaussian case was incorrect in the previous version. It
has been rectified now.
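As a concrete instance of the setup (a standard toy example, not taken from the paper): with a doubly symmetric binary source, Alice and Bob can each apply a function, say majority, to their own samples without communicating; the joint distribution of their outputs can then be estimated by Monte Carlo. All names below (`sample_dsbs`, `simulate_majority`) are illustrative, not from the paper.

```python
import random

def sample_dsbs(rho, rng):
    """One draw from the doubly symmetric binary source DSBS(rho):
    X is uniform on {0, 1}; Y equals X with probability (1 + rho)/2."""
    x = rng.randint(0, 1)
    y = x if rng.random() < (1 + rho) / 2 else 1 - x
    return x, y

def simulate_majority(rho, k, trials, seed=0):
    """Alice and Bob each see their own half of k correlated sample
    pairs and, without communicating, output the majority bit of
    their half. Returns the empirical joint distribution of (U, V)."""
    rng = random.Random(seed)
    counts = {(u, v): 0 for u in (0, 1) for v in (0, 1)}
    for _ in range(trials):
        xs, ys = zip(*(sample_dsbs(rho, rng) for _ in range(k)))
        u = int(2 * sum(xs) > k)  # Alice's output: majority of her bits
        v = int(2 * sum(ys) > k)  # Bob's output: majority of his bits
        counts[(u, v)] += 1
    return {uv: c / trials for uv, c in counts.items()}

dist = simulate_majority(rho=0.6, k=5, trials=20000)
agree = dist[(0, 0)] + dist[(1, 1)]
print(dist, agree)
```

The decidability question is whether a target Q is achievable by *some* choice of such local functions, up to a given tolerance; the example above only evaluates one fixed choice.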
Robust Geometry Estimation using the Generalized Voronoi Covariance Measure
The Voronoi Covariance Measure of a compact set K of R^d is a tensor-valued
measure that encodes geometric information on K and which is known to be
resilient to Hausdorff noise but sensitive to outliers. In this article, we
generalize this notion to any distance-like function delta and define the
delta-VCM. We show that the delta-VCM is resilient to Hausdorff noise and to
outliers, thus providing a tool to estimate robustly normals from a point cloud
approximation. We present experiments showing the robustness of our approach
for normal and curvature estimation and sharp feature detection.
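The δ-VCM itself is more involved; as a rough illustration of covariance-based normal estimation from a noisy point cloud, here is plain local PCA, the simpler, outlier-sensitive baseline that the VCM family of methods is designed to robustify (the `pca_normal` helper is illustrative, not the paper's construction):

```python
import numpy as np

def pca_normal(points, query, radius):
    """Estimate the surface normal at `query` as the eigenvector of the
    local covariance matrix with smallest eigenvalue (classic PCA
    normal estimation on the neighbors within `radius`)."""
    nbrs = points[np.linalg.norm(points - query, axis=1) < radius]
    cov = np.cov(nbrs.T)          # 3 x 3 local covariance
    w, v = np.linalg.eigh(cov)    # eigenvalues in ascending order
    return v[:, 0]                # direction of least variance

# Noisy samples of the plane z = 0 (Hausdorff-type noise in z)
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-1, 1, (500, 2)),
                       rng.normal(0, 0.01, 500)])
n = pca_normal(pts, np.array([0.0, 0.0, 0.0]), radius=0.5)
print(n)  # approximately (0, 0, +/-1)
```

A single distant outlier can tilt this estimate arbitrarily, which is exactly the failure mode the δ-VCM addresses by replacing the distance to K with a distance-like function δ.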
Veni Vidi Vici, A Three-Phase Scenario For Parameter Space Analysis in Image Analysis and Visualization
Automatic analysis of enormous sets of images is a critical task in the life
sciences. This task faces many challenges: algorithms are highly
parameterized, significant human input is intertwined, and a standard
meta-visualization approach is lacking. This paper proposes an alternative iterative
approach for optimizing input parameters, saving time by minimizing the user
involvement, and allowing for understanding the workflow of algorithms and
discovering new ones. The main focus is on developing an interactive
visualization technique that enables users to analyze the relationships between
sampled input parameters and corresponding output. This technique is
implemented as a prototype called Veni Vidi Vici, or "I came, I saw, I
conquered." This strategy is inspired by the mathematical formulas of numbering
computable functions and is developed atop ImageJ, a scientific image
processing program. A case study is presented to investigate the proposed
framework. Finally, the paper explores some potential future issues in the
application of the proposed approach in parameter space analysis in
visualization.
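The paper's prototype is built atop ImageJ; as a language-neutral sketch of the underlying idea, enumerating a sampled parameter space and pairing each input setting with its output is the raw material for such an input/output visualization. The `sweep` helper and the toy "algorithm" below are hypothetical stand-ins, not part of the paper:

```python
import itertools

def sweep(algorithm, param_grid):
    """Run `algorithm` on every combination of the sampled parameter
    values and record each input setting alongside its output."""
    names = list(param_grid)
    records = []
    for combo in itertools.product(*(param_grid[n] for n in names)):
        params = dict(zip(names, combo))
        records.append({**params, "output": algorithm(**params)})
    return records

# Toy stand-in for a parameterized image-analysis step
rows = sweep(lambda threshold, blur: round(threshold * (1 + blur), 2),
             {"threshold": [0.2, 0.5, 0.8], "blur": [0, 1, 2]})
print(len(rows), rows[0])  # 9 records; first is threshold=0.2, blur=0
```

Each record is one point in the sampled parameter space; the interactive technique described in the paper would then let the user explore the relationship between these inputs and outputs visually.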
Making dynamic modelling effective in economics
Mathematics has been extremely effective in physics, but not in economics beyond finance. To establish economics as a science, we should follow the Galilean method and try to deduce mathematical models of markets from empirical data, as has been done for financial markets. Financial markets are nonstationary. This means that 'value' is subjective. Nonstationarity also means that the form of the noise in a market cannot be postulated a priori, but must be deduced from the empirical data. I discuss the essence of complexity in a market as unexpected events, and end with a biological speculation about market growth.
Keywords: economics; financial markets; stochastic process; Markov process; complex systems
On Convex Envelopes and Regularization of Non-Convex Functionals without moving Global Minima
We provide theory for the computation of convex envelopes of non-convex
functionals including an l2-term, and use these to suggest a method for
regularizing a more general set of problems. The applications are particularly
aimed at compressed sensing and low rank recovery problems but the theory
relies on results which potentially could be useful also for other types of
non-convex problems. For optimization problems where the l2-term contains a
singular matrix we prove that the regularizations never move the global minima.
This result in turn relies on a theorem concerning the structure of convex
envelopes which is interesting in its own right. It says that at any point
where the convex envelope does not touch the non-convex functional we
necessarily have a direction in which the convex envelope is affine.
Comment: arXiv admin note: text overlap with arXiv:1609.0937
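In one dimension the convex envelope is simply the lower convex hull of the graph, which makes the affine-direction phenomenon easy to see: wherever the envelope detaches from the function, it is a straight segment. A minimal sketch (my own illustration of the general notion, not the paper's construction, which concerns functionals with an l2-term):

```python
import numpy as np

def convex_envelope_1d(x, f):
    """Convex envelope of samples (x, f(x)), x increasing, computed as
    the lower convex hull of the graph (monotone-chain scan)."""
    hull = []  # indices of lower-hull vertices
    for i in range(len(x)):
        while len(hull) >= 2:
            i0, i1 = hull[-2], hull[-1]
            # drop i1 if it lies on or above the chord from i0 to i
            if (f[i1] - f[i0]) * (x[i] - x[i0]) >= \
               (f[i] - f[i0]) * (x[i1] - x[i0]):
                hull.pop()
            else:
                break
        hull.append(i)
    return np.interp(x, x[hull], f[hull])

x = np.linspace(-2, 2, 401)
f = (x**2 - 1)**2                  # non-convex double well
env = convex_envelope_1d(x, f)
# Between the two wells the envelope detaches from f and is affine
# (here constant ~0), illustrating the structure theorem's conclusion.
print(env[200], f[200])            # at x = 0: envelope ~0, f(0) ~1
```

The interval (-1, 1), where env < f, is exactly where the envelope is affine; outside it the envelope touches f.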