Noise stability is computable and approximately low-dimensional
Questions of noise stability play an important role in hardness of approximation in computer science as well as in the theory of voting. In many applications, the goal is to find an optimizer of noise stability among all possible partitions of R^n, n ≥ 1, into k parts with given Gaussian measures μ_1, . . . , μ_k. We call a partition ϵ-optimal if its noise stability is optimal up to an additive ϵ. In this paper, we give an explicit, computable function n(ϵ) such that an ϵ-optimal partition exists in R^{n(ϵ)}. This result has implications for the computability of certain problems in non-interactive simulation, which are addressed in a subsequent work.
Keywords: Gaussian noise stability; Plurality is stablest; Ornstein-Uhlenbeck operator
National Science Foundation (U.S.) (Award CCF 1320105); United States. Office of Naval Research (Grant N00014-16-1-2227)
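To make the notion of noise stability concrete, here is a minimal Monte Carlo sketch (my own illustration, not from the paper): for correlated Gaussians X ~ N(0, I_n) and Y = ρX + sqrt(1-ρ²)Z, the noise stability of a two-part partition is the probability that X and Y land in the same part. For the balanced half-space partition of R^n this probability has the closed form 1/2 + arcsin(ρ)/π (Sheppard's formula), which the simulation can be checked against.

```python
import numpy as np

def same_part_probability(rho, n=3, samples=200_000, seed=0):
    """Monte Carlo estimate of the probability that correlated Gaussian
    vectors X and Y = rho*X + sqrt(1-rho^2)*Z fall on the same side of
    the balanced half-space partition {x : x_1 <= 0} of R^n."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((samples, n))
    z = rng.standard_normal((samples, n))
    y = rho * x + np.sqrt(1 - rho**2) * z
    same_part = (x[:, 0] <= 0) == (y[:, 0] <= 0)
    return same_part.mean()

rho = 0.5
estimate = same_part_probability(rho)
# Sheppard's formula for two balanced half-spaces:
exact = 0.5 + np.arcsin(rho) / np.pi
```

For ρ = 0.5 the exact value is 2/3, and the estimate agrees to within Monte Carlo error; optimizing this quantity over all measure-constrained partitions, in a dimension controlled by ϵ alone, is the problem the paper addresses.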
Non interactive simulation of correlated distributions is decidable
A basic problem in information theory is the following: Let P be an arbitrary distribution on pairs (X, Y) where the marginals X and Y are (potentially) correlated. Let Alice and Bob be two players where Alice gets samples x_1, x_2, . . . and Bob gets samples y_1, y_2, . . ., and for all i, (x_i, y_i) ~ P. What joint distributions Q can be simulated by Alice and Bob without any interaction?
Classical works in information theory by Gács-Körner and Wyner answer this question when at least one of P or Q is the distribution on {0,1} x {0,1} where each marginal is unbiased and identical. However, other than this special case, the answer to this question is understood in very few cases. Recently, Ghazi, Kamath and Sudan showed that this problem is decidable for Q supported on {0,1} x {0,1}. We extend their result to Q supported on any finite alphabet.
We rely on recent results in Gaussian geometry (by the authors) as well as a new "smoothing argument" inspired by the method of "boosting" from learning theory and potential function arguments from complexity theory and additive combinatorics.
Comment: The reduction for non-interactive simulation for general source distribution to the Gaussian case was incorrect in the previous version. It has been rectified now.
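As a toy instance of non-interactive simulation (my own illustration, not an example from the paper): suppose the source P is the doubly symmetric binary source, i.e. i.i.d. ±1 pairs with unbiased marginals and correlation E[xy] = ρ. Without any communication, Alice can multiply consecutive pairs of her bits and Bob can do the same to his; the resulting pairs have unbiased marginals and correlation ρ², so the players have simulated the same source with a weaker correlation.

```python
import numpy as np

def dsbs_samples(rho, n, rng):
    """n i.i.d. pairs (x, y) of ±1 bits: unbiased marginals, E[xy] = rho."""
    x = rng.choice([-1, 1], size=n)
    flip = rng.random(n) < (1 - rho) / 2   # flip y independently per pair
    y = np.where(flip, -x, x)
    return x, y

rng = np.random.default_rng(1)
rho = 0.6
x, y = dsbs_samples(rho, 2_000_000, rng)

# Alice multiplies consecutive bits of her stream; Bob does the same.
# E[(x1*x2)(y1*y2)] = E[x1*y1] * E[x2*y2] = rho^2 by independence.
u = x[0::2] * x[1::2]
v = y[0::2] * y[1::2]
corr = np.mean(u * v)   # ≈ rho**2
```

Deciding, for a general source P and a general finite-alphabet target Q, whether any such local maps exist (even approximately) is exactly the decidability question the paper resolves.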
Making dynamic modelling effective in economics
Mathematics has been extremely effective in physics, but not in economics beyond finance. To establish economics as a science we should follow the Galilean method and try to deduce mathematical models of markets from empirical data, as has been done for financial markets. Financial markets are nonstationary. This means that 'value' is subjective. Nonstationarity also means that the form of the noise in a market cannot be postulated a priori, but must be deduced from the empirical data. I discuss the essence of complexity in a market as unexpected events, and end with a biological speculation about market growth.
Keywords: economics; financial markets; stochastic process; Markov process; complex systems
How Quantum Computers Fail: Quantum Codes, Correlations in Physical Systems, and Noise Accumulation
The feasibility of computationally superior quantum computers is one of the
most exciting and clear-cut scientific questions of our time. The question
touches on fundamental issues regarding probability, physics, and
computability, as well as on exciting problems in experimental physics,
engineering, computer science, and mathematics. We propose three related
directions towards a negative answer. The first is a conjecture about physical
realizations of quantum codes, the second has to do with correlations in
stochastic physical systems, and the third proposes a model for quantum
evolutions when noise accumulates. The paper is dedicated to the memory of Itamar Pitowsky.
Comment: 16 pages
Evaluating Data Assimilation Algorithms
Data assimilation leads naturally to a Bayesian formulation in which the
posterior probability distribution of the system state, given the observations,
plays a central conceptual role. The aim of this paper is to use this Bayesian
posterior probability distribution as a gold standard against which to evaluate
various commonly used data assimilation algorithms.
A key aspect of geophysical data assimilation is the high dimensionality and
low predictability of the computational model. With this in mind, yet with the
goal of allowing an explicit and accurate computation of the posterior
distribution, we study the 2D Navier-Stokes equations in a periodic geometry.
We compute the posterior probability distribution by state-of-the-art
statistical sampling techniques. The commonly used algorithms that we evaluate
against this accurate gold standard, as quantified by comparing the relative
error in reproducing its moments, are 4DVAR and a variety of sequential
filtering approximations based on 3DVAR and on extended and ensemble Kalman
filters.
The primary conclusions are that: (i) with appropriate parameter choices,
approximate filters can perform well in reproducing the mean of the desired
probability distribution; (ii) however they typically perform poorly when
attempting to reproduce the covariance; (iii) this poor performance is
compounded by the need to modify the covariance, in order to induce stability.
Thus, whilst filters can be a useful tool in predicting mean behavior, they
should be viewed with caution as predictors of uncertainty. These conclusions
are intrinsic to the algorithms and will not change if the model complexity is increased, for example by employing a smaller viscosity, or by using a detailed NWP model.
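The mean-versus-covariance pattern described above can be seen even in a scalar linear-Gaussian sketch (my own illustration, not the paper's Navier-Stokes setup). For a linear-Gaussian model the Kalman filter computes the exact Bayesian posterior, so it plays the role of the gold standard; a 3DVAR-style filter that cycles the same analysis step with a fixed, ad hoc background variance tracks the posterior mean comparably well, but the variance it reports is simply wrong.

```python
import numpy as np

rng = np.random.default_rng(0)
a, q, r = 0.9, 0.5, 0.25      # dynamics coefficient, model noise, obs noise
n_steps = 5000

# simulate truth x and observations y = x + noise
x = np.zeros(n_steps)
y = np.zeros(n_steps)
for k in range(1, n_steps):
    x[k] = a * x[k-1] + rng.normal(0, np.sqrt(q))
    y[k] = x[k] + rng.normal(0, np.sqrt(r))

# Kalman filter: exact posterior mean/variance for this linear model
m_kf, c_kf = 0.0, 1.0
kf_means = np.zeros(n_steps)
for k in range(1, n_steps):
    m_pred, c_pred = a * m_kf, a**2 * c_kf + q        # predict
    gain = c_pred / (c_pred + r)                      # optimal gain
    m_kf = m_pred + gain * (y[k] - m_pred)            # analysis mean
    c_kf = (1 - gain) * c_pred                        # analysis variance
    kf_means[k] = m_kf

# 3DVAR-style filter: identical cycle, but with a fixed background variance
c_bg = 2.0                                            # static, ad hoc choice
gain3 = c_bg / (c_bg + r)
m_3d = 0.0
var_means = np.zeros(n_steps)
for k in range(1, n_steps):
    m_3d = a * m_3d + gain3 * (y[k] - a * m_3d)
    var_means[k] = m_3d
c_3d_reported = (1 - gain3) * c_bg                    # the variance 3DVAR reports

rmse_kf = np.sqrt(np.mean((kf_means[100:] - x[100:])**2))
rmse_3d = np.sqrt(np.mean((var_means[100:] - x[100:])**2))
```

With these parameters the two mean trajectories have nearly the same RMSE, while the reported 3DVAR variance differs markedly from the true steady-state posterior variance c_kf, mirroring conclusions (i) and (ii).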
Importance sampling for thermally induced switching and non-switching probabilities in spin-torque magnetic nanodevices
Spin-transfer torque magnetoresistive random access memory is a potentially
transformative technology in the non-volatile memory market. Its viability
depends, in part, on one's ability to predictably induce or prevent switching;
however, thermal fluctuations cause small but important errors in both the
writing and reading processes. Computing these very small probabilities for
magnetic nanodevices using naive Monte Carlo simulations is essentially
impossible due to their slow statistical convergence, but variance reduction
techniques can offer an effective way to improve their efficiency. Here, we
provide an illustration of how importance sampling can be efficiently used to
estimate low read and write soft error rates of macrospin and coupled-spin systems.
Comment: 11 pages, 14 figures
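The variance-reduction idea can be illustrated on a standard rare-event toy problem (my own sketch, not the macrospin model of the paper): estimating a small Gaussian tail probability P(X > a). Naive Monte Carlo wastes almost every sample, while importance sampling draws from a proposal shifted into the rare region and reweights by the likelihood ratio, recovering the answer accurately with the same budget.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)
a = 4.0            # rare-event threshold; P(X > 4) ≈ 3.17e-5
n = 100_000

exact = 0.5 * erfc(a / sqrt(2.0))       # closed form for the Gaussian tail

# naive Monte Carlo: only a handful of samples exceed the threshold,
# so the estimate has enormous relative error
x = rng.standard_normal(n)
naive = np.mean(x > a)

# importance sampling: propose from N(a, 1), centered on the rare region,
# and reweight by the likelihood ratio phi(x)/phi(x - a) = exp(a^2/2 - a*x)
x_is = rng.standard_normal(n) + a
weights = np.exp(a**2 / 2 - a * x_is)
is_est = np.mean(weights * (x_is > a))
```

Roughly half the importance samples land above the threshold, so the estimator's relative standard error drops from order 1 to well under a percent; the same shift-and-reweight logic underlies the switching-probability estimates in the paper.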