Information Theory and Noisy Computation
We report on two types of results. The first is a study of the rate of decay of information carried by a signal which is being propagated over a noisy channel. The second is a series of lower bounds on the depth, size, and component reliability of noisy logic circuits that are required to compute some function reliably. The arguments used for the circuit results are information-theoretic, and in particular, the signal decay result is essential to the depth lower bound. Our first result can be viewed as a quantified version of the data processing lemma, for the case of Boolean random variables.
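The signal-decay phenomenon can be illustrated with a toy calculation (a sketch in the spirit of the data processing lemma, not the paper's quantified bound; the function names and parameters are ours): a uniform Boolean signal pushed through a cascade of binary symmetric channels loses mutual information at every stage.

```python
import math

def h2(p):
    # Binary entropy in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_info_after_n_bscs(p, n):
    # X ~ Bernoulli(1/2) pushed through n cascaded BSC(p) channels.
    # The cascade is itself a BSC with effective flip probability
    # q_n = (1 - (1 - 2p)**n) / 2, so I(X; Y_n) = 1 - h2(q_n).
    q = (1 - (1 - 2 * p) ** n) / 2
    return 1.0 - h2(q)

for n in (1, 2, 4, 8):
    print(n, mutual_info_after_n_bscs(0.1, n))
```

As the data processing lemma requires, the printed mutual information is strictly decreasing in the number of channel stages.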
Information Processing, Computation and Cognition
Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both, although others disagree vehemently. Yet different cognitive scientists use "computation" and "information processing" to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism and connectionism/computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.
Quantum computation and the physical computation level of biological information processing
On the basis of introspective analysis, we establish a crucial requirement for the physical computation basis of consciousness: it should allow processing a significant amount of information together at the same time. Classical computation does not satisfy the requirement. At the fundamental physical level, it is a network of two-body interactions, each the input-output transformation of a universal Boolean gate. Thus, it cannot process together at the same time more than the three-bit input of this gate; many such gates in parallel do not count, since the information is not processed together. Quantum computation satisfies the requirement. In the light of our recent explanation of the speed-up, quantum measurement of the solution of the problem is analogous to a many-body interaction between the parts of a perfect classical machine whose mechanical constraints represent the problem to be solved. The many-body interaction satisfies all the constraints together at the same time, producing the solution in one shot. This sheds light on the physical computation level of the theories that place consciousness in quantum measurement, and explains how information coming from disparate sensory channels comes together in the unity of subjective experience. The fact that the fundamental mechanism of consciousness is the same as that of the quantum speed-up gives quantum consciousness a potentially enormous evolutionary advantage.
Security Games with Information Leakage: Modeling and Computation
Most models of Stackelberg security games assume that the attacker knows only the defender's mixed strategy and cannot observe (even partially) the instantiated pure strategy. Such partial observation of the deployed pure strategy, an issue we refer to as information leakage, is a significant concern in practical applications. While previous research on patrolling games has considered the attacker's real-time surveillance, our setting, and therefore our models and techniques, are fundamentally different. More specifically, after describing the information leakage model, we start with an LP formulation to compute the defender's optimal strategy in the presence of leakage. Perhaps surprisingly, we show that a key subproblem in solving this LP (more precisely, the defender oracle) is NP-hard even for the simplest of security game models. We then approach the problem from three possible directions: efficient algorithms for restricted cases, approximation algorithms, and heuristic sampling algorithms that improve upon the status quo. Our experiments confirm the necessity of handling information leakage and the advantage of our algorithms.
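The paper's LP with leakage (and its NP-hard defender oracle) is beyond a short sketch, but the no-leakage baseline conveys the flavor. In a toy zero-sum model, assumed here for illustration only, where fully covering a target neutralizes its payoff, the defender's optimum equalizes the attacker's best expected payoff across covered targets, which for this simple structure can be found by bisection rather than a general LP solver:

```python
def defender_coverage(payoffs, resources, iters=100):
    # Toy zero-sum model (illustrative, not the paper's formulation):
    # marginal coverage c_t in [0, 1], sum_t c_t <= resources, and the
    # attacker's expected payoff at target t is u_t * (1 - c_t).
    # To hold the attacker's best payoff down to z, the defender needs
    # c_t(z) = max(0, 1 - z / u_t); bisect on z until the required
    # total coverage exactly meets the budget.
    lo, hi = 0.0, max(payoffs)
    for _ in range(iters):
        z = (lo + hi) / 2
        needed = sum(max(0.0, 1 - z / u) for u in payoffs)
        if needed > resources:
            lo = z  # budget too small for this payoff level; raise z
        else:
            hi = z
    z = (lo + hi) / 2
    return [max(0.0, 1 - z / u) for u in payoffs], z

coverage, value = defender_coverage([5.0, 3.0, 2.0], resources=1.0)
```

With payoffs (5, 3, 2) and one unit of coverage, the equalized attacker payoff is 60/31, and more attractive targets receive more coverage. Leakage breaks exactly this marginal decomposition, which is why the paper's defender oracle becomes hard.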
Information-theoretic temporal Bell inequality and quantum computation
An information-theoretic temporal Bell inequality is formulated to contrast classical and quantum computation. Any classical algorithm satisfies the inequality, while quantum ones can violate it; the violation of the inequality is therefore an immediate consequence of the quantumness of the computation. Furthermore, this approach suggests a notion of temporal nonlocality in quantum computation.
Information and Computation
Our work is based on two theses: (1) most problems are approximately solved; that is, we live with uncertainty; (2) for problems with partial or approximate information, the usual algorithm-centered approach can be supplemented, and sometimes replaced, by the information-centered approach. We briefly discuss these two theses here. Much of this article will be devoted to their expansion, illustrated by numerous examples.
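A minimal illustration of the information-centered approach (the setting and names below are ours, not the article's): suppose an integral over [0, 1] must be approximated from only n function values. The available information, not the algorithm, is what limits the achievable accuracy, and the worst-case error shrinks as the information grows.

```python
def integrate_from_samples(f, n):
    # Information-centered view: we observe only n values of f
    # (partial information) and must approximate its integral
    # over [0, 1]. Midpoint rule: sample at the midpoints of n
    # equal subintervals and sum.
    width = 1.0 / n
    return width * sum(f((i + 0.5) * width) for i in range(n))

# For any f with Lipschitz constant L, this rule's worst-case error
# is L / (4 * n): more information (larger n) means less uncertainty.
approx = integrate_from_samples(lambda x: x * x, 100)
```

Doubling the number of samples halves the worst-case error for Lipschitz integrands, a simple instance of trading information for accuracy.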