Information-theoretic bound on the energy cost of stochastic simulation
Physical systems are often simulated using a stochastic computation where
different final states result from identical initial states. Here, we derive
the minimum energy cost of simulating a complex data set of a general physical
system with a stochastic computation. We show that the cost is proportional to
the difference between two information-theoretic measures of complexity of the
data: the statistical complexity and the predictive information. We identify this
difference as the amount of information erased during the computation. Finally,
we illustrate the physics of information by implementing the stochastic
computation as a Gedankenexperiment of a Szilard-type engine. The results
create a new link between thermodynamics, information theory, and complexity.
Comment: 5 pages, 1 figure
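The bound described in this abstract can be written as a Landauer-type relation. The following is a hedged reconstruction, not the paper's verbatim equation: writing C_mu for the statistical complexity of the data and E for its predictive information (a labeling of ours; the known inequality C_mu >= E makes the bound non-negative),

```latex
% Landauer-type bound: k_B T ln 2 of work per bit of information erased,
% where the erased information is the statistical complexity minus the
% predictive information of the simulated data.
W_{\min} = k_B T \ln 2 \, (C_\mu - E), \qquad C_\mu \ge E
```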
On the Communication Complexity of Secure Computation
Information theoretically secure multi-party computation (MPC) is a central
primitive of modern cryptography. However, relatively little is known about the
communication complexity of this primitive.
In this work, we develop powerful information theoretic tools to prove lower
bounds on the communication complexity of MPC. We restrict ourselves to a
3-party setting in order to bring out the power of these tools without
introducing too many complications. Our techniques include the use of a data
processing inequality for residual information (i.e., the gap between mutual
information and Gács-Körner common information), a new information
inequality for 3-party protocols, and the idea of distribution switching by
which lower bounds computed under certain worst-case scenarios can be shown to
apply for the general case.
Using these techniques we obtain tight bounds on communication complexity by
MPC protocols for various interesting functions. In particular, we show
concrete functions that have "communication-ideal" protocols, which achieve the
minimum communication simultaneously on all links in the network. Also, we
obtain the first explicit example of a function that incurs a higher
communication cost than the input length in the secure computation model of
Feige, Kilian and Naor (1994), who had shown that such functions exist. We also
show that our communication bounds imply tight lower bounds on the amount of
randomness required by MPC protocols for many interesting functions.
Comment: 37 pages
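The residual information quantity used above can be computed exactly for small finite distributions, since the Gács-Körner common information of a joint pmf equals the entropy of the connected-component label of its bipartite support graph. A minimal Python sketch (the function name and interface are ours, illustrative only):

```python
from math import log2

def residual_information(pmf):
    """Residual information R(X;Y) = I(X;Y) - C_GK(X;Y).

    The Gacs-Korner common information C_GK of a finite joint pmf is the
    entropy of the connected-component label of the bipartite support
    graph on the x- and y-values.

    pmf: dict mapping (x, y) -> probability, assumed to sum to 1.
    """
    # Marginals.
    px, py = {}, {}
    for (x, y), p in pmf.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p

    # Mutual information I(X;Y).
    mi = sum(p * log2(p / (px[x] * py[y]))
             for (x, y), p in pmf.items() if p > 0)

    # Connected components of the support graph, via union-find.
    parent = {}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    def union(a, b):
        parent.setdefault(a, a)
        parent.setdefault(b, b)
        parent[find(a)] = find(b)

    for (x, y), p in pmf.items():
        if p > 0:
            union(('x', x), ('y', y))

    # Mass of each component = distribution of the common random variable.
    comp_mass = {}
    for (x, y), p in pmf.items():
        if p > 0:
            root = find(('x', x))
            comp_mass[root] = comp_mass.get(root, 0.0) + p
    gk = -sum(p * log2(p) for p in comp_mass.values())

    return mi - gk
```

For perfectly correlated uniform bits, mutual and common information both equal one bit, so the residual information vanishes; it is strictly positive when correlation exists but no exact common part does.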
Tight Limits on Nonlocality from Nontrivial Communication Complexity; a.k.a. Reliable Computation with Asymmetric Gate Noise
It has long been known that the existence of certain superquantum nonlocal
correlations would cause communication complexity to collapse. The absurdity of
a world in which any nonlocal binary function could be evaluated with a
constant amount of communication in turn provides a tantalizing way to
distinguish quantum mechanics from incorrect theories of physics; the statement
"communication complexity is nontrivial" has even been conjectured to be a
concise information-theoretic axiom for characterizing quantum mechanics. We
directly address the viability of that perspective with two results. First, we
exhibit a nonlocal game such that communication complexity collapses in any
physical theory whose maximal winning probability exceeds the quantum value.
Second, we consider the venerable CHSH game that initiated this line of
inquiry. In that case, the quantum value is about 0.85 but it is known that a
winning probability of approximately 0.91 would collapse communication
complexity. We show that the 0.91 result is the best possible using a large
class of proof strategies, suggesting that the communication complexity axiom
is insufficient for characterizing CHSH correlations. Both results build on new
insights about reliable classical computation. The first exploits our
formalization of an equivalence between amplification and reliable computation,
while the second follows from a rigorous determination of the threshold for
reliable computation with formulas of noise-free XOR gates and
ε-noisy AND gates.
Comment: 64 pages, 6 figures
Estimating the Expected Value of Partial Perfect Information in Health Economic Evaluations using Integrated Nested Laplace Approximation
The Expected Value of Partial Perfect Information (EVPPI) is a
decision-theoretic measure of the "cost" of parametric uncertainty in decision
making used principally in health economic decision making. Despite this
decision-theoretic grounding, the uptake of EVPPI calculations in practice has
been slow. This is in part due to the prohibitive computational time required
to estimate the EVPPI via Monte Carlo simulations. However, recent developments
have demonstrated that the EVPPI can be estimated by non-parametric regression
methods, which have significantly decreased the computation time required to
approximate the EVPPI. Under certain circumstances, high-dimensional Gaussian
Process regression is suggested, but this can still be prohibitively expensive.
Applying fast computation methods developed in spatial statistics using
Integrated Nested Laplace Approximations (INLA) and projecting from a
high-dimensional into a low-dimensional input space allows us to decrease the
computation time for fitting these high-dimensional Gaussian Processes, often
substantially. We demonstrate that the EVPPI calculated using our method for
Gaussian Process regression is in line with the standard Gaussian Process
regression method and that despite the apparent methodological complexity of
this new method, R functions are available in the package BCEA to implement it
simply and efficiently.
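The regression idea behind these estimators can be sketched in a few lines: regress the simulated net benefits on the focal parameters, then compare the mean of the per-sample maxima of the fitted values with the maximum of their means. The snippet below uses a simple polynomial basis as a cheap stand-in for the Gaussian Process/INLA smoother described in the abstract; the function name and interface are our own assumptions:

```python
import numpy as np

def evppi_regression(nb, phi, degree=2):
    """Regression-based EVPPI estimate for the focal parameters phi.

    nb  : (S, D) array of simulated net benefits
          (S probabilistic-sensitivity-analysis samples, D decisions).
    phi : (S,) or (S, K) array of the focal parameter draws.

    Fits E[NB_d | phi] by least squares on a polynomial basis (a crude
    stand-in for a GP/INLA smoother), then returns
    mean_s max_d g_d(phi_s) - max_d mean_s g_d(phi_s).
    """
    nb = np.asarray(nb, float)
    phi = np.atleast_2d(np.asarray(phi, float).T).T  # ensure shape (S, K)
    S, D = nb.shape
    # Polynomial design matrix in the focal parameters.
    cols = [np.ones(S)]
    for k in range(phi.shape[1]):
        for d in range(1, degree + 1):
            cols.append(phi[:, k] ** d)
    X = np.column_stack(cols)
    # Fitted conditional expectations g_d(phi), one column per decision.
    fitted = X @ np.linalg.lstsq(X, nb, rcond=None)[0]
    return fitted.max(axis=1).mean() - fitted.mean(axis=0).max()
```

In a toy model where one decision's net benefit is a standard normal parameter plus noise and the other's is zero, the EVPPI for that parameter is E[max(0, phi)] = 1/sqrt(2*pi), which the estimator recovers.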
Separating decision tree complexity from subcube partition complexity
The subcube partition model of computation is at least as powerful as
decision trees but no separation between these models was known. We show that
there exists a function whose deterministic subcube partition complexity is
asymptotically smaller than its randomized decision tree complexity, resolving
an open problem of Friedgut, Kahn, and Wigderson (2002). Our lower bound is
based on the information-theoretic techniques first introduced to lower bound
the randomized decision tree complexity of the recursive majority function.
We also show that the public-coin partition bound, the best known lower bound
method for randomized decision tree complexity subsuming other general
techniques such as block sensitivity, approximate degree, randomized
certificate complexity, and the classical adversary bound, also lower bounds
randomized subcube partition complexity. This shows that all these lower bound
techniques cannot prove optimal lower bounds for randomized decision tree
complexity, which answers an open question of Jain and Klauck (2010) and Jain,
Lee, and Vishnoi (2014).
Comment: 16 pages, 1 figure
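Deterministic decision tree complexity, one of the measures compared above, has a simple (exponential-time) recursive characterization: D(f) = 0 if f is constant, and otherwise the minimum over variables i of 1 + max over the two restrictions f|x_i=b. The small-n sketch below implements it; the interface is ours:

```python
from functools import lru_cache

def decision_tree_depth(f, n):
    """Deterministic decision tree complexity D(f) of a Boolean function
    on n bits, by exhaustive recursion. Exponential time; for small n only.

    f: function taking an n-tuple of bits and returning 0/1.
    """
    all_inputs = [tuple((m >> i) & 1 for i in range(n))
                  for m in range(2 ** n)]

    @lru_cache(maxsize=None)
    def depth(fixed):  # fixed: frozenset of (index, bit) constraints
        live = [x for x in all_inputs
                if all(x[i] == b for i, b in fixed)]
        if len({f(x) for x in live}) <= 1:
            return 0  # f is constant on the surviving inputs
        queried = {i for i, _ in fixed}
        return min(1 + max(depth(fixed | {(i, 0)}),
                           depth(fixed | {(i, 1)}))
                   for i in range(n) if i not in queried)

    return depth(frozenset())
```

For instance, parity is evasive (every variable must be queried), while a constant function needs no queries at all.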
Efficient Privacy Preserving Distributed Clustering Based on Secret Sharing
In this paper, we propose a privacy-preserving distributed clustering
protocol for horizontally partitioned data based on a very efficient
homomorphic additive secret sharing scheme. The model we use for the
protocol is novel in the sense that it utilizes two non-colluding third
parties. We provide a brief security analysis of our protocol from an
information-theoretic point of view, which is a stronger security model.
We present a communication and computation complexity analysis of our
protocol alongside another protocol previously proposed for the same
problem. We also include experimental results for the computation and
communication overhead of these two protocols. Our protocol not only
outperforms the other in execution time and communication overhead on the
data holders, but also uses a more efficient model for many data mining
applications.
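Additive secret sharing of the kind the abstract relies on can be sketched in a few lines: a value is split into random shares that sum to it modulo a public modulus, any proper subset of shares is uniformly distributed (information-theoretic security), and pointwise addition of share vectors shares the sum. The modulus and interface below are illustrative assumptions, not the paper's exact construction:

```python
import secrets

P = 2 ** 61 - 1  # public prime modulus (an illustrative choice)

def share(x, n=3):
    """Split integer x into n additive shares modulo P.

    Any n-1 shares are jointly uniform, so they reveal nothing about x.
    """
    parts = [secrets.randbelow(P) for _ in range(n - 1)]
    parts.append((x - sum(parts)) % P)
    return parts

def reconstruct(parts):
    return sum(parts) % P

def add_shares(a, b):
    """Pointwise addition of share vectors shares the sum, so parties
    can aggregate values (e.g. partial cluster-centroid sums) without
    revealing individual inputs."""
    return [(u + v) % P for u, v in zip(a, b)]
```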