    The Value of Information for Populations in Varying Environments

    The notion of information pervades informal descriptions of biological systems, but formal treatments face the problem of defining a quantitative measure of information rooted in a concept of fitness, which is itself an elusive notion. Here, we present a model of population dynamics where this problem is amenable to a mathematical analysis. In the limit where any information about future environmental variations is common to the members of the population, our model is equivalent to known models of financial investment. In this case, the population can be interpreted as a portfolio of financial assets, and previous analyses have shown that a key quantity of Shannon's communication theory, the mutual information, sets a fundamental limit on the value of information. We show that this bound can be violated when accounting for features that are irrelevant in finance but inherent to biological systems, such as the stochasticity present at the individual level. This leads us to generalize the measures of uncertainty and information usually encountered in information theory.
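
    The financial baseline the abstract builds on is the classical Kelly result: with fair odds and proportional betting, side information about the next state raises the achievable log-growth rate by exactly the mutual information I(X; Y). A minimal numerical sketch of that bound (the joint distribution below is an illustrative assumption, not data from the paper):

        import numpy as np

        # Joint distribution p(x, y): x = next environment state, y = side signal.
        # The numbers are illustrative assumptions only.
        p_xy = np.array([[0.4, 0.1],
                         [0.1, 0.4]])
        p_x = p_xy.sum(axis=1)   # marginal over states
        p_y = p_xy.sum(axis=0)   # marginal over signals
        odds = 1.0 / p_x         # fair odds o(x) = 1 / p(x)

        def log_growth(bets, probs):
            # Expected log-growth rate: sum_x p(x) * log2(b(x) * o(x)).
            return np.sum(probs * np.log2(bets * odds))

        # No side information: bet proportionally to p(x).
        w0 = log_growth(p_x, p_x)

        # Side information: after observing y, bet proportionally to p(x | y).
        w1 = sum(p_y[j] * log_growth(p_xy[:, j] / p_y[j], p_xy[:, j] / p_y[j])
                 for j in range(len(p_y)))

        # The growth-rate gain equals the mutual information I(X; Y).
        mi = np.sum(p_xy * np.log2(p_xy / np.outer(p_x, p_y)))
        print(f"gain = {w1 - w0:.4f} bits   I(X;Y) = {mi:.4f} bits")

    It is this equality between growth-rate gain and mutual information that, per the abstract, can be violated once individual-level stochasticity enters the population model.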

    On intersections of interval graphs

    If one can associate with each vertex of a graph an interval of a line, so that two intervals intersect just when the corresponding vertices are joined by an edge, then one speaks of an interval graph. It is shown that any graph on v vertices is the intersection ("product") of at most ⌊v/2⌋ interval graphs on the same vertex set. For v = 2k, k factors are necessary for, and only for, the complete k-partite graph K_{2,2,…,2}. Some results for the hypergraph generalization of this question are also obtained.
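
    As a concrete instance of the bound, the 4-cycle K_{2,2} (v = 4, k = 2) is not itself an interval graph, yet it is the intersection of two interval graphs on the same vertex set. A small sketch checking this (the interval endpoints are illustrative choices, not taken from the paper):

        from itertools import combinations

        def interval_graph_edges(intervals):
            # Edge {u, v} iff the closed intervals of u and v overlap.
            return {frozenset((u, v))
                    for u, v in combinations(intervals, 2)
                    if intervals[u][0] <= intervals[v][1]
                    and intervals[v][0] <= intervals[u][1]}

        # Two interval representations on vertices 1..4.
        # Factor 1 separates the non-edge {1, 3}; factor 2 separates {2, 4}.
        g1 = interval_graph_edges({1: (0, 1), 3: (2, 3), 2: (0, 3), 4: (0, 3)})
        g2 = interval_graph_edges({2: (0, 1), 4: (2, 3), 1: (0, 3), 3: (0, 3)})

        # Their intersection is exactly the 4-cycle K_{2,2}.
        c4 = {frozenset(e) for e in [(1, 2), (2, 3), (3, 4), (4, 1)]}
        assert g1 & g2 == c4

    Each factor handles one of the two non-adjacent vertex pairs, which is why k = 2 factors suffice here and, by the theorem, are also necessary.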

    Separation of estimation and control for discrete time systems

    On Woodall's interval problem

    An information theoretic tradeoff between complexity and accuracy

    A fundamental question in learning theory is the quantification of the basic tradeoff between the complexity of a model and its predictive accuracy. One valid way of quantifying this tradeoff, known as the "Information Bottleneck", is to measure both the complexity of the model and its prediction accuracy by using Shannon's mutual information. In this paper we show that the Information Bottleneck framework answers a well-defined and known coding problem and at the same time provides a general relationship between complexity and prediction accuracy, measured by mutual information. We study the nature of this complexity-accuracy tradeoff and discuss some of its theoretical properties. Furthermore, we present relations to classical information theoretic problems, such as rate-distortion theory, the cost-capacity tradeoff, and source coding with side information.
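
    A minimal sketch of the alternating self-consistent updates commonly used to trace this tradeoff (the iterative equations of the original Information Bottleneck algorithm; the toy joint distribution and the value of beta below are illustrative assumptions):

        import numpy as np

        rng = np.random.default_rng(0)

        def information_bottleneck(p_xy, n_clusters, beta, n_iter=200):
            # p_xy: joint distribution p(x, y) as an array of shape (|X|, |Y|).
            p_x = p_xy.sum(axis=1)
            p_y_given_x = p_xy / p_x[:, None]
            # Random soft assignment p(t | x), shape (|X|, |T|).
            q = rng.random((len(p_x), n_clusters))
            q /= q.sum(axis=1, keepdims=True)
            for _ in range(n_iter):
                p_t = q.T @ p_x                                    # p(t)
                p_y_given_t = (q * p_x[:, None]).T @ p_y_given_x / p_t[:, None]
                # D_KL[p(y|x) || p(y|t)] for each pair (x, t); eps avoids log(0).
                eps = 1e-12
                d = np.array([[np.sum(px * np.log((px + eps) / (pt + eps)))
                               for pt in p_y_given_t]
                              for px in p_y_given_x])
                # Self-consistent update: p(t | x) ∝ p(t) * exp(-beta * D_KL).
                q = p_t[None, :] * np.exp(-beta * d)
                q /= q.sum(axis=1, keepdims=True)
            return q

        # Toy joint: two x-values predict y = 0, the other two predict y = 1.
        p_xy = np.array([[0.25, 0.0], [0.25, 0.0], [0.0, 0.25], [0.0, 0.25]])
        print(information_bottleneck(p_xy, n_clusters=2, beta=10.0).round(2))
        # x-values sharing the same predictive distribution p(y | x) end up
        # in the same cluster.

    The parameter beta sets the position on the curve: small beta favors compression (low I(X;T)), large beta favors accuracy (high I(T;Y)).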

    Informational Issues in Decentralized Control
