
    Field Theoretical Analysis of On-line Learning of Probability Distributions

    On-line learning of probability distributions is analyzed from a field-theoretical point of view. We obtain an optimal on-line learning algorithm, since the renormalization group enables us to control the number of degrees of freedom of the system according to the number of examples. We learn not the parameters of a model but the probability distributions themselves; the algorithm therefore requires no a priori knowledge of a model. Comment: 4 pages, 1 figure, RevTeX

    The role of input noise in transcriptional regulation

    Even under constant external conditions, the expression levels of genes fluctuate. Much emphasis has been placed on the components of this noise that are due to randomness in transcription and translation; here we analyze the role of noise associated with the inputs to transcriptional regulation, the random arrival and binding of transcription factors to their target sites along the genome. This noise sets a fundamental physical limit on the reliability of genetic control, and has clear signatures, but we show that these are easily obscured by experimental limitations and even by conventional methods for plotting the variance vs. mean expression level. We argue that simple, global models of noise dominated by transcription and translation are inconsistent with the embedding of gene expression in a network of regulatory interactions. Analysis of recent experiments on transcriptional control in the early Drosophila embryo shows that these results are quantitatively consistent with the predicted signatures of input noise, and we discuss the experiments needed to test the importance of input noise more generally. Comment: 11 pages, 5 figures; minor corrections
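The distinction between intrinsic and input noise can be seen in a common phenomenological decomposition: intrinsic (birth-death) noise contributes a Poisson-like variance equal to the mean, while input/extrinsic noise adds a term growing as the square of the mean. This is a minimal sketch of that decomposition, not the paper's model; the weight `c_ext` is a hypothetical illustrative value.

```python
import numpy as np

# Illustrative noise decomposition (assumption, not the paper's model):
# Poisson-like intrinsic term, variance ~ mean, plus an input/extrinsic
# term scaling as mean^2 with a hypothetical coefficient c_ext.
def expression_variance(mean, c_ext=0.05):
    return mean + c_ext * mean**2

means = np.array([10.0, 100.0, 1000.0])
fano = expression_variance(means) / means  # Fano factor = variance / mean

# A purely Poisson process has Fano factor 1; the input-noise term makes
# it grow with the mean, a signature easily hidden in variance-vs-mean plots.
```

At low expression the element looks Poissonian, while at high expression the extrinsic term dominates, which is why plotting conventions matter for detecting it.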

    Entropy and information in neural spike trains: Progress on the sampling problem

    The major problem in information-theoretic analysis of neural responses and other biological data is the reliable estimation of entropy-like quantities from small samples. We apply a recently introduced Bayesian entropy estimator to synthetic data inspired by experiments, and to real experimental spike trains. The estimator performs admirably even very deep in the undersampled regime, where other techniques fail. This opens new possibilities for the information-theoretic analysis of experiments, and may be of general interest as an example of learning from limited data. Comment: 7 pages, 4 figures; referee-suggested changes, accepted version
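The undersampling bias referred to here is visible even with the simplest estimators. The paper's method is a Bayesian estimator; the sketch below only contrasts the naive plug-in estimate with the elementary Miller-Madow correction, as a minimal illustration of the bias problem, not the paper's technique.

```python
import numpy as np
from collections import Counter

def plugin_entropy(samples):
    """Maximum-likelihood ("plug-in") entropy estimate in bits.
    Systematically biased downward when samples are few."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def miller_madow(samples):
    """Plug-in estimate plus the first-order (K-1)/(2N) bias correction,
    in bits. Far cruder than the Bayesian estimator used in the paper."""
    n, k = len(samples), len(set(samples))
    return plugin_entropy(samples) + (k - 1) / (2 * n * np.log(2))

samples = [0, 1, 2, 3, 0, 1, 2, 3]  # uniform over 4 states, true H = 2 bits
```

Deep in the undersampled regime (samples comparable to or fewer than the number of states), both of these estimators fail badly, which is exactly the regime the Bayesian approach targets.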

    Optimizing information flow in small genetic networks. II: Feed forward interactions

    Central to the functioning of a living cell is its ability to control the readout or expression of information encoded in the genome. In many cases, a single transcription factor protein activates or represses the expression of many genes. As the concentration of the transcription factor varies, the target genes thus undergo correlated changes, and this redundancy limits the ability of the cell to transmit information about input signals. We explore how interactions among the target genes can reduce this redundancy and optimize information transmission. Our discussion builds on recent work [Tkacik et al., Phys Rev E 80, 031920 (2009)], and there are connections to much earlier work on the role of lateral inhibition in enhancing the efficiency of information transmission in neural circuits. For simplicity we consider here the case where the interactions have a feed-forward structure, with no loops. Even with this limitation, the networks that optimize information transmission have a structure reminiscent of the networks found in real biological systems.

    Shannon Meets Carnot: Generalized Second Thermodynamic Law

    The classical thermodynamic laws fail to capture the behavior of systems whose energy Hamiltonian is an explicit function of the temperature. Such a Hamiltonian arises, for example, when information processing systems, such as communication channels, are modeled as thermal systems. Here we generalize the second law of thermodynamics to encompass systems with temperature-dependent energy levels, dQ = TdS + ⟨dE/dT⟩dT, where ⟨·⟩ denotes averaging over the Boltzmann distribution, and arrive at a new definition of the basic notion of temperature. This generalization makes it possible to express, for instance, the mutual information of the Gaussian channel as a consequence of the fundamental laws of nature: the laws of thermodynamics.
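For reference, the Gaussian-channel mutual information mentioned at the end is the standard Shannon result, which the paper re-derives thermodynamically; a one-line sketch:

```python
import math

def gaussian_capacity(signal_power, noise_power):
    """Shannon capacity of the additive white Gaussian noise channel,
    C = (1/2) log2(1 + P/N), in bits per channel use."""
    return 0.5 * math.log2(1.0 + signal_power / noise_power)

# A signal-to-noise ratio of 3 gives exactly one bit: 0.5 * log2(4) = 1.0
```

The paper's claim is that this same expression follows from the generalized second law applied to a thermal model of the channel.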

    Information capacity of genetic regulatory elements

    Changes in a cell's external or internal conditions are usually reflected in the concentrations of the relevant transcription factors. These proteins in turn modulate the expression levels of the genes under their control and sometimes need to perform non-trivial computations that integrate several inputs and affect multiple genes. At the same time, the activities of the regulated genes would fluctuate even if the inputs were held fixed, as a consequence of the intrinsic noise in the system, and such noise must fundamentally limit the reliability of any genetic computation. Here we use information theory to formalize the notion of information transmission in simple genetic regulatory elements in the presence of physically realistic noise sources. The dependence of this "channel capacity" on noise parameters, cooperativity, and the cost of making signaling molecules is explored systematically. We find that, at least in principle, capacities higher than one bit should be achievable and that, consequently, genetic regulation is not limited to the use of binary, or "on-off", components. Comment: 17 pages, 9 figures
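The channel capacity in question can be computed numerically for any discretized regulatory element. As a minimal sketch (not the paper's continuous model with realistic noise), the Blahut-Arimoto algorithm below computes the capacity of a hypothetical noisy "on-off" element modeled as a binary symmetric channel; it is the graded, cooperative responses studied in the paper that can push capacity above one bit.

```python
import numpy as np

def blahut_arimoto(W, iters=500):
    """Capacity in bits of a discrete memoryless channel with transition
    matrix W[x, y] = P(y|x), via Blahut-Arimoto iteration."""
    nx, _ = W.shape
    p = np.full(nx, 1.0 / nx)            # input distribution, iteratively refined
    for _ in range(iters):
        q = p @ W                        # output marginal P(y)
        logratio = np.where(W > 0, np.log(W / q), 0.0)
        d = (W * logratio).sum(axis=1)   # KL divergence D(W(.|x) || q), in nats
        p = p * np.exp(d)
        p /= p.sum()
    q = p @ W
    logratio = np.where(W > 0, np.log(W / q), 0.0)
    d = (W * logratio).sum(axis=1)
    return float(p @ d / np.log(2))      # convert nats to bits

# Hypothetical noisy on-off element: binary symmetric channel, 10% error.
eps = 0.1
W = np.array([[1 - eps, eps], [eps, 1 - eps]])
C = blahut_arimoto(W)                    # ~0.53 bits, well below one bit
```

Even modest readout noise costs the binary element almost half its nominal bit, which motivates asking when and how real regulatory elements can do better.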

    A Theory of Cheap Control in Embodied Systems

    We present a framework for designing cheap control architectures for embodied agents. Our derivation is guided by the classical problem of universal approximation, whereby we explore the possibility of exploiting the agent's embodiment for a new and more efficient universal approximation of behaviors generated by sensorimotor control. This embodied universal approximation is compared with the classical non-embodied universal approximation. To exemplify our approach, we present a detailed quantitative case study for policy models defined in terms of conditional restricted Boltzmann machines. In contrast to non-embodied universal approximation, which requires an exponential number of parameters, in the embodied setting we are able to generate all possible behaviors with a drastically smaller model, thus obtaining cheap universal approximation. We test and corroborate the theory experimentally with a six-legged walking machine. The experiments show that the sufficient controller complexity predicted by our theory is tight, which means that the theory has direct practical implications. Keywords: cheap design, embodiment, sensorimotor loop, universal approximation, conditional restricted Boltzmann machine. Comment: 27 pages, 10 figures

    Information transmission in genetic regulatory networks: a review

    Genetic regulatory networks enable cells to respond to changes in internal and external conditions by dynamically coordinating their gene expression profiles. Our ability to make quantitative measurements in these biochemical circuits has deepened our understanding of what kinds of computations genetic regulatory networks can perform and with what reliability. These advances have motivated researchers to look for connections between the architecture and function of genetic regulatory networks. Transmitting information between a network's inputs and its outputs has been proposed as one such possible measure of function, relevant in certain biological contexts. Here we summarize recent developments in the application of information theory to gene regulatory networks. We first review basic concepts in information theory necessary to understand recent work. We then discuss the functional complexity of gene regulation, which arises from the molecular nature of the regulatory interactions. We end by reviewing some experiments supporting the view that genetic networks responsible for early development of multicellular organisms might be maximizing transmitted 'positional' information. Comment: Submitted to J Phys: Condens Matter, 31 pages
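The quantity at the heart of this review is the mutual information between a network's inputs and outputs. A minimal sketch for a discrete joint distribution; the 2x2 table is made up for illustration, not data from any experiment.

```python
import numpy as np

def mutual_information(P_xy):
    """I(X;Y) in bits for a joint probability table P_xy (rows: X, cols: Y)."""
    Px = P_xy.sum(axis=1, keepdims=True)   # marginal P(x)
    Py = P_xy.sum(axis=0, keepdims=True)   # marginal P(y)
    mask = P_xy > 0                        # skip zero entries (0 log 0 = 0)
    return float(np.sum(P_xy[mask] * np.log2(P_xy[mask] / (Px * Py)[mask])))

P = np.array([[0.4, 0.1],
              [0.1, 0.4]])   # input and output tend to agree
I = mutual_information(P)    # about 0.28 of the 1 bit nominally available
```

Noise in the regulatory interactions is what keeps I below the nominal one bit here, which is the general tension the review explores.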

    Weak pairwise correlations imply strongly correlated network states in a neural population

    Biological networks have so many possible states that exhaustive sampling is impossible. Successful analysis thus depends on simplifying hypotheses, but experiments on many systems hint that complicated, higher-order interactions among large groups of elements play an important role. In the vertebrate retina, we show that weak correlations between pairs of neurons coexist with strongly collective behavior in the responses of ten or more neurons. Surprisingly, we find that this collective behavior is described quantitatively by models that capture the observed pairwise correlations but assume no higher-order interactions. These maximum entropy models are equivalent to Ising models, and predict that larger networks are completely dominated by correlation effects. This suggests that the neural code has associative or error-correcting properties, and we provide preliminary evidence for such behavior. As a first test for the generality of these ideas, we show that similar results are obtained from networks of cultured cortical neurons. Comment: Full account of work presented at the conference on Computational and Systems Neuroscience (COSYNE), 17-20 March 2005, in Salt Lake City, Utah (http://cosyne.org)
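The maximum entropy model consistent with mean firing rates and pairwise correlations is exactly an Ising model. A tiny sketch for three binary units with illustrative parameters (not fitted to retinal data), enumerating all states exactly:

```python
import itertools
import numpy as np

# Pairwise maximum entropy model P(s) ~ exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j)
# for spins s_i = +/-1. Fields h and couplings J are illustrative values,
# not parameters fitted to any recorded population.
h = np.array([0.1, -0.1, 0.0])
J = np.array([[0.0, 0.5, 0.2],
              [0.5, 0.0, 0.3],
              [0.2, 0.3, 0.0]])

states = np.array(list(itertools.product([-1, 1], repeat=3)))  # all 8 states
log_weight = states @ h + 0.5 * np.einsum('si,ij,sj->s', states, J, states)
P = np.exp(log_weight)
P /= P.sum()             # Boltzmann distribution over the 8 states

# Model-predicted pairwise correlations <s_i s_j>; in the paper these are
# matched to measurement, and all higher-order structure then follows
# with no additional parameters.
C = np.einsum('s,si,sj->ij', P, states, states)
```

For populations of ten or more neurons the same construction applies, except that the parameters must be fitted to the measured means and correlations rather than chosen by hand.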