
    Influence of gene expression gradients on positional information content in fly embryos

    Thesis: S.B., Massachusetts Institute of Technology, Department of Physics, 2018. Cataloged from PDF version of thesis. Includes bibliographical references (pages 49-51). The concept of positional information was introduced to qualitatively explain how individual cells participate in forming patterns. Recent experimental and theoretical developments have made it possible to study specific biological systems quantitatively within the framework of positional information. Much previous work has focused on using the full gene expression profiles when calculating the available positional information. In an attempt to simplify the model, a discretized version was proposed in which the gene expression profiles are reduced to a binary system. Binarizing, however, results in a significant loss of information relative to using the full profiles. The question remains how coarsely the full model can be discretized without losing essential positional information. Recent work has shown the importance of concentration gradients in the folding of proteins during embryonic development. Based on this work, we posit that the gradients of gene profiles may be an important addition to the discretized model. Using data provided by the Gregor lab at Princeton University, we test this hypothesis on the gap gene network of Drosophila embryos. Adding gradients to the positional information framework requires an algorithm that can efficiently take meaningful derivatives of noisy data, which we implement using Chebyshev interpolation. We also implement an adaptation of Monte Carlo methods for finding maxima of multidimensional functions. We find that the derivatives can account for over one bit of the information lost in the discretization process, allowing the cells to locate themselves with an average precision close to one internuclear spacing. This suggests that a binary model using gradients may be almost as efficient as the model that uses the full gene profiles. We propose that a discrete model of positional information that includes gradients does not lose significant information relative to a model that uses full profiles. by Alasdair Hastewell. S.B.
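    The derivative step described above can be sketched with a standard least-squares Chebyshev fit: fitting a low-degree Chebyshev series to a noisy profile and differentiating the fitted series gives a well-behaved gradient. The profile shape, noise level, and polynomial degree below are illustrative assumptions, not the thesis's actual data or parameters.

```python
# Sketch: smooth derivative of a noisy 1-D profile via a Chebyshev fit.
# The Gaussian "profile" stands in for a gap-gene-like expression profile.
import numpy as np
from numpy.polynomial import chebyshev as C

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)              # normalized position axis
profile = np.exp(-((x + 0.3) ** 2) / 0.1)    # idealized expression profile
noisy = profile + 0.02 * rng.standard_normal(x.size)

# A low-degree least-squares Chebyshev fit filters the noise; the series
# can then be differentiated analytically instead of by finite differences.
fit = C.Chebyshev.fit(x, noisy, deg=12)
gradient = fit.deriv()(x)                    # smooth derivative estimate
```

    The key design point is that differentiation happens on the fitted polynomial, not on the raw samples, so measurement noise is not amplified the way it would be by direct finite differencing.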

    ASYMPTOTIC NORMALITY OF ENTROPY ESTIMATORS

    Shannon's entropy plays a central role in many fields of mathematics. In the first chapter, we present a sufficient condition for the asymptotic normality of the plug-in estimator of Shannon's entropy defined on a countable alphabet. The sufficient condition covers a range of cases with countably infinite alphabets, for which no normality results were previously known. In the second chapter of this dissertation, we establish the asymptotic normality of a recently introduced non-parametric entropy estimator under another sufficient condition. The proposed estimator, developed from Turing's perspective, is known for its improved estimation accuracy.
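    As a concrete illustration (not drawn from the dissertation itself), the plug-in estimator simply substitutes the empirical frequencies of an i.i.d. sample into Shannon's formula:

```python
# Sketch: plug-in (maximum-likelihood) estimator of Shannon's entropy.
import math
from collections import Counter

def plugin_entropy(sample):
    """Entropy (in nats) of the empirical distribution of `sample`."""
    n = len(sample)
    counts = Counter(sample)
    # H_hat = -sum_k p_hat(k) * log p_hat(k), with p_hat(k) = count_k / n
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# For a balanced two-symbol sample the estimate equals log 2 ≈ 0.693 nats.
print(plugin_entropy(["H", "T"] * 500))
```

    On countably infinite alphabets this estimator is biased downward at finite sample sizes (unseen symbols contribute nothing), which is part of what makes its asymptotic distribution theory non-trivial.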

    The External Tape Hypothesis: a Turing machine based approach to cognitive computation

    The symbol processing or "classical cognitivist" approach to mental computation suggests that the cognitive architecture operates rather like a digital computer. The components of the architecture are input, output and central systems. The input and output systems communicate with both the internal and external environments of the cognizer and transmit codes to and from the rule governed, central processing system which operates on structured representational expressions in the internal environment. The connectionist approach, by contrast, suggests that the cognitive architecture should be thought of as a network of interconnected neuron-like processing elements (nodes) which operates rather like a brain. Connectionism distinguishes input, output and central or "hidden" layers of nodes. Connectionists claim that internal processing consists not of the rule governed manipulation of structured symbolic expressions, but of the excitation and inhibition of activity and the alteration of connection strengths via message passing within and between layers of nodes in the network. A central claim of the thesis is that neither symbol processing nor connectionism provides an adequate characterization of the role of the external environment in cognitive computation. An alternative approach, called the External Tape Hypothesis (ETH), is developed which claims, on the basis of Turing's analysis of routine computation, that the Turing machine model can be used as the basis for a theory which includes the environment as an essential part of the cognitive architecture. The environment is thought of as the tape, and the brain as the control of a Turing machine. Finite state automata, Turing machines, and universal Turing machines are described, including details of Turing's original universal machine construction. 
A short account of relevant aspects of the history of digital computation is followed by a critique of the symbol processing approach as it is construed by influential proponents such as Allen Newell and Zenon Pylyshyn among others. The External Tape Hypothesis is then developed as an alternative theoretical basis. In the final chapter, the ETH is combined with the notion of a self-describing Turing machine to provide the basis for an account of thinking and the development of internal representations.
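    The control/tape separation at the heart of the ETH can be sketched as a minimal Turing machine simulator in which the tape is an explicit external object acted on by the control (the transition table). The machine below, a toy unary incrementer, and all names in it are illustrative, not drawn from the thesis:

```python
# Sketch: Turing machine with the tape as an explicit external object.
# `control` plays the role of the "brain"; `tape` plays the environment.
def run(control, tape, state="start", head=0, blank="_"):
    """Run `control`, a dict mapping (state, symbol) -> (state, symbol, move),
    over `tape`, a dict from positions to symbols, until it halts."""
    while state != "halt":
        symbol = tape.get(head, blank)            # read from the environment
        state, written, move = control[(state, symbol)]
        tape[head] = written                      # write back to the environment
        head += {"L": -1, "R": 1, "N": 0}[move]   # move the head
    return tape

# Unary incrementer: scan right over 1s, write a 1 at the first blank.
control = {
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("halt", "1", "N"),
}
tape = {0: "1", 1: "1", 2: "1"}
run(control, tape)  # tape now holds four 1s
```

    Because the tape is passed in from outside and mutated in place, the "cognizer" here is only the control: all persistent structure lives in the environment, which is the ETH's central claim in miniature.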

    Information Processing, Computation and Cognition

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both – although others disagree vehemently. Yet different cognitive scientists use ‘computation’ and ‘information processing’ to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism and connectionism/computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates’ empirical aspects.

    Possibility to Realize an Accelerated Turing Machine

    On the basis of a theorem according to which an evanescent photon is a superluminal particle, the author considers the possibility of realizing a computer system with higher performance than conventional silicon processors. To realize such a quantum computer utilizing evanescent photons, we must replace electronic components with optical ones, and thus an equivalent optical transistor is required. This critical component for quantum computing can be created using meta-material circuits with a non-linear refractive index. Based on this optical computer system utilizing meta-material technology, it can be shown that superluminal computation, a new concept for an accelerated Turing machine, can be realized in the physical world.

    Marriages of Mathematics and Physics: A Challenge for Biology

    Human attempts to access, measure and organize physical phenomena have led to a manifold construction of mathematical and physical spaces. We will survey the evolution of geometries from Euclid to the Algebraic Geometry of the 20th century. The role of Persian/Arabic Algebra in this transition and its Western symbolic development is emphasized. In this connection, we will also discuss changes in the ontological attitudes toward mathematics and its applications. Historically, the encounter of geometric and algebraic perspectives enriched mathematical practices and their foundations. Yet the collapse of Euclidean certitudes, after over 2300 years, and the crisis in mathematical analysis in the 19th century led to the exclusion of “geometric judgments” from the foundations of Mathematics. After the success and the limits of the logico-formal analysis, it is necessary to broaden our foundational tools and re-examine the interactions with the natural sciences. In particular, the way the geometric and algebraic approaches organize knowledge is analyzed as a cross-disciplinary and cross-cultural issue and will be examined in Mathematical Physics and Biology. We finally discuss how the current notions of mathematical (phase) “space” should be revisited for the purposes of the life sciences.

    M-lattice, a system for signal synthesis and processing based on reaction-diffusion

    Thesis (Sc. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1994. Includes bibliographical references (leaves 148-154). by Alexander Semyon Sherstinsky. Sc.D.

    Legal Personhood for Artificial Intelligences

