
    How much information can one bit of memory retain about a Bernoulli sequence?

    The maximin problem of the maximization of the minimum amount of information that a single bit of memory retains about the entire past is investigated. The problem is to estimate the supremum over all possible sequences of update rules of the minimum information that the bit of memory at epoch (n+1) retains about the previous n inputs. Using only elementary techniques, it is shown that the maximin covariance between the memory at epoch (n+1) and past inputs is Θ(1/n), the maximum average covariance is Θ(1/n), and the maximin mutual information is Ω(1/n^2). In a consideration of related issues, the authors also provide an exact count of the number of Boolean functions of n variables that can be obtained recursively from Boolean functions of two variables, discuss extensions and applications of the original problem, and indicate links with issues in neural computation.
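
    The Θ(1/n) covariance scaling has a simple illustration. Under one concrete update rule, chosen here purely for illustration and not claimed to be the paper's optimal rule, a reservoir-sampling-style update that overwrites the memory bit with the k-th input with probability 1/k, the final bit equals a uniformly chosen past input, so its covariance with each of the n Bernoulli(1/2) inputs is exactly 1/(4n). A quick simulation sketch:

```python
import random

def simulate_covariances(n, trials, seed=0):
    """Empirical Cov(M_{n+1}, X_i) for a reservoir-style one-bit memory.

    At step k the memory is overwritten by input x_k with probability 1/k,
    so the final bit equals a uniformly chosen past input, and
    Cov(M, X_i) = 1/(4n) for every i when inputs are Bernoulli(1/2)."""
    rng = random.Random(seed)
    mems, inputs = [], [[] for _ in range(n)]
    for _ in range(trials):
        m = 0
        for k in range(1, n + 1):
            x = rng.randint(0, 1)
            inputs[k - 1].append(x)
            if rng.random() < 1.0 / k:
                m = x
        mems.append(m)
    mbar = sum(mems) / trials
    covs = []
    for col in inputs:
        xbar = sum(col) / trials
        covs.append(sum(m * x for m, x in zip(mems, col)) / trials - mbar * xbar)
    return covs

covs = simulate_covariances(n=10, trials=100_000)
# each of the 10 empirical covariances sits near 1/(4*10) = 0.025
```

    Every past input retains the same Θ(1/n) covariance under this rule, matching the order of the maximin bound quoted in the abstract.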

    A Tutorial on Fisher Information

    In many statistical applications that concern mathematical psychologists, the concept of Fisher information plays an important role. In this tutorial we clarify the concept of Fisher information as it manifests itself across three different statistical paradigms. First, in the frequentist paradigm, Fisher information is used to construct hypothesis tests and confidence intervals using maximum likelihood estimators; second, in the Bayesian paradigm, Fisher information is used to define a default prior; lastly, in the minimum description length paradigm, Fisher information is used to measure model complexity.
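
    As a concrete instance (a standard textbook example, not taken from the tutorial itself), the Fisher information of a single Bernoulli(theta) observation is I(theta) = 1/(theta(1 - theta)). The sketch below checks that closed form against the curvature definition I(theta) = -E[d^2/dtheta^2 log p(x|theta)] via finite differences:

```python
import math

def fisher_analytic(theta):
    # closed form for one Bernoulli(theta) observation: I(theta) = 1/(theta(1-theta))
    return 1.0 / (theta * (1.0 - theta))

def fisher_numeric(theta, h=1e-5):
    # I(theta) = -E[ d^2/dtheta^2 log p(x | theta) ], expectation over x in {0, 1}
    def loglik(x, t):
        return x * math.log(t) + (1 - x) * math.log(1.0 - t)
    total = 0.0
    for x, px in ((1, theta), (0, 1.0 - theta)):
        d2 = (loglik(x, theta + h) - 2.0 * loglik(x, theta) + loglik(x, theta - h)) / h ** 2
        total += px * (-d2)
    return total
```

    The same quantity drives all three paradigms mentioned above; for example, the Bayesian default (Jeffreys) prior is proportional to sqrt(I(theta)).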

    The Computational Structure of Spike Trains

    Neurons perform computations, and convey the results of those computations through the statistical structure of their output spike trains. Here we present a practical method, grounded in the information-theoretic analysis of prediction, for inferring a minimal representation of that structure and for characterizing its complexity. Starting from spike trains, our approach finds their causal state models (CSMs), the minimal hidden Markov models or stochastic automata capable of generating statistically identical time series. We then use these CSMs to objectively quantify both the generalizable structure and the idiosyncratic randomness of the spike train. Specifically, we show that the expected algorithmic information content (the information needed to describe the spike train exactly) can be split into three parts describing (1) the time-invariant structure (complexity) of the minimal spike-generating process, which describes the spike train statistically; (2) the randomness (internal entropy rate) of the minimal spike-generating process; and (3) a residual pure noise term not described by the minimal spike-generating process. We use CSMs to approximate each of these quantities. The CSMs are inferred nonparametrically from the data, making only mild regularity assumptions, via the causal state splitting reconstruction algorithm. The methods presented here complement more traditional spike train analyses by describing not only spiking probability and spike train entropy, but also the complexity of a spike train's structure. We demonstrate our approach using both simulated spike trains and experimental data recorded in rat barrel cortex during vibrissa stimulation. (Comment: somewhat different format from the journal version, but same content.)
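
    Two of the per-symbol quantities a CSM exposes can be made concrete on a toy model (a hypothetical two-state unifilar process standing in for an inferred CSM; the states and probabilities below are assumptions for illustration, not the paper's data or its CSSR inference): the internal entropy rate is the state-averaged emission entropy, and the statistical complexity is the entropy of the stationary distribution over causal states.

```python
import math

def entropy_bits(ps):
    # Shannon entropy in bits, skipping zero-probability outcomes
    return -sum(p * math.log2(p) for p in ps if p > 0)

def stationary(trans):
    # power-iterate a row-stochastic matrix to its stationary distribution
    pi = [1.0 / len(trans)] * len(trans)
    for _ in range(500):
        pi = [sum(pi[i] * trans[i][j] for i in range(len(trans)))
              for j in range(len(trans))]
    return pi

# Toy unifilar model: state 0 spikes with prob 0.5 (and then moves to
# state 1); state 1 never spikes and always returns to state 0.
emit = [[0.5, 0.5], [1.0, 0.0]]    # per-state P(no spike), P(spike)
trans = [[0.5, 0.5], [1.0, 0.0]]   # state-to-state transition probabilities

pi = stationary(trans)
h_rate = sum(pi[s] * entropy_bits(emit[s]) for s in range(2))  # internal entropy rate
C_mu = entropy_bits(pi)                                        # statistical complexity
```

    For this toy process the stationary distribution is (2/3, 1/3), giving an entropy rate of 2/3 bit per symbol and a statistical complexity of about 0.918 bits.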

    Hash Functions for Episodic Recognition and Retrieval

    Episodic memory systems for artificially intelligent agents must cope with an ever-growing episodic memory store. This paper presents an approach for minimizing the size of the store by using specialized hash functions to convert each memory into a relatively short binary code. A set of desiderata for such hash functions is presented, including locality sensitivity and reversibility. The paper then introduces multiple approaches for such functions and compares their effectiveness.
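
    One well-known family with the locality-sensitivity desideratum is random-hyperplane hashing (SimHash). The sketch below is a generic illustration with made-up episode feature vectors, not the paper's actual functions, and it does not model reversibility; it shows how similar episodes map to binary codes that are close in Hamming distance:

```python
import random

def simhash(vec, planes):
    # the sign of the projection onto each random hyperplane gives one code bit
    return tuple(int(sum(v * w for v, w in zip(vec, p)) >= 0.0) for p in planes)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

rng = random.Random(42)
dim, bits = 8, 32
planes = [[rng.gauss(0.0, 1.0) for _ in range(dim)] for _ in range(bits)]

e1 = [1.0, 0.9, 0.0, 0.2, 0.8, 0.1, 0.0, 0.5]   # hypothetical episode features
e2 = [1.0, 0.8, 0.1, 0.2, 0.9, 0.1, 0.0, 0.4]   # near-duplicate of e1
e3 = [0.0, 0.1, 1.0, 0.9, 0.0, 0.8, 1.0, 0.1]   # unrelated episode
c1, c2, c3 = (simhash(e, planes) for e in (e1, e2, e3))
# near-duplicate episodes agree on most code bits, so Hamming distance on
# the short codes serves as a cheap recognition/retrieval test
```

    Retrieval then compares 32-bit codes instead of full episodes, which is the size reduction the abstract describes.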

    A generalized birthday approach for efficiently finding linear relations in l-sequences

    Feedback-with-carry shift registers (FCSRs) have previously been available in two configurations, the Fibonacci and Galois architectures. Recently, a generalized and unifying FCSR structure and theory was presented. The new ring FCSR model repairs some weaknesses of the older architectures. Most notably, the carry-cell bias property that was exploited for an attack on the eSTREAM final-portfolio cipher F-FCSR-H v2 is no longer present in the updated (and unbroken) F-FCSR-H v3 stream cipher. In this paper we show how to exploit a particular set of linear relations in ring FCSR sequences. We show what biases can be expected, and we also present a generalized birthday algorithm for actually realizing these relations. As all prerequisites of a distinguishing attack are present, we explicitly show a new such attack on F-FCSR-H v3 with an online time complexity of only 2^{37.2}. The offline time complexity (for finding a linear relation) is 2^{56.2}. This is the first successful attack on F-FCSR-H v3, and the first attack to breach the exhaustive-search complexity limit. Note that this attack is completely different from the attack on F-FCSR-H v2. We focus on this particular application in the paper, but the presented algorithm is actually very general. The algorithm can be applied to any FCSR automaton, so linearly filtered FCSRs and FCSR combiners may be particularly interesting targets for cryptanalysis.
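
    The merge step at the heart of a generalized birthday algorithm can be sketched in a few lines (a generic two-list illustration over random data, not the paper's actual attack on F-FCSR-H v3): bucket one list by the masked bit positions where a relation must vanish, then probe with the other list, so matching costs roughly linear time rather than the quadratic cost of trying all pairs.

```python
import random

def birthday_merge(L1, L2, mask):
    # bucket L2 by its masked bits, then probe with L1: returns every pair
    # (a, b) with (a ^ b) & mask == 0 in roughly linear expected time
    buckets = {}
    for b in L2:
        buckets.setdefault(b & mask, []).append(b)
    return [(a, b) for a in L1 for b in buckets.get(a & mask, [])]

rng = random.Random(1)
mask = (1 << 16) - 1                  # the relation must vanish on 16 bit positions
L1 = [rng.getrandbits(32) for _ in range(1 << 10)]
L2 = [rng.getrandbits(32) for _ in range(1 << 10)]
pairs = birthday_merge(L1, L2, mask)  # about 2^20 / 2^16 = 16 matches expected
```

    Repeating this merge across more lists drives down the weight of the surviving relations, which is the offline-phase trade-off behind complexity figures like the 2^{56.2} quoted above.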