
    Biologically inspired distributed machine cognition: a new formal approach to hyperparallel computation

    The irresistible march toward multiple-core chip technology presents currently intractable programming challenges. High-level mental processes in many animals, and their analogs for social structures, appear similarly massively parallel, and recent mathematical models addressing them may be adaptable to the multi-core programming problem.

    Equation-free modeling of evolving diseases: Coarse-grained computations with individual-based models

    We demonstrate how direct simulation of stochastic, individual-based models can be combined with continuum numerical analysis techniques to study the dynamics of evolving diseases. Sidestepping the necessity of obtaining explicit population-level models, the approach analyzes the (unavailable in closed form) 'coarse' macroscopic equations, estimating the necessary quantities through appropriately initialized, short 'bursts' of individual-based dynamic simulation. We illustrate this approach by analyzing a stochastic and discrete model for the evolution of disease agents caused by point mutations within individual hosts. Building up from classical SIR and SIRS models, our example uses a one-dimensional lattice for variant space, and assumes a finite number of individuals. Macroscopic computational tasks enabled through this approach include stationary state computation, coarse projective integration, parametric continuation, and stability analysis.
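
    To make the lift/run/restrict/project loop concrete, here is a minimal Python sketch of coarse projective integration, assuming a well-mixed stochastic SIR model in place of the paper's lattice-of-variants model; the population size, rates, burst length, and projection step are all illustrative, not taken from the paper.

```python
import numpy as np

# Coarse projective integration sketch on a well-mixed stochastic SIR model.
# All parameters here are illustrative, not taken from the paper.
N, beta, gamma, dt = 1000, 0.3, 0.1, 0.1
rng = np.random.default_rng(0)

def lift(s_frac, i_frac):
    """Build a microscopic state (one label per individual) from coarse fractions."""
    states = np.zeros(N, dtype=int)              # 0 = S, 1 = I, 2 = R
    n_s, n_i = int(s_frac * N), int(i_frac * N)
    states[n_s:n_s + n_i] = 1
    states[n_s + n_i:] = 2
    rng.shuffle(states)
    return states

def micro_step(states):
    """One step of the individual-based stochastic dynamics."""
    i_frac = np.mean(states == 1)
    s_mask, i_mask = states == 0, states == 1
    states[s_mask & (rng.random(N) < beta * i_frac * dt)] = 1   # infection
    states[i_mask & (rng.random(N) < gamma * dt)] = 2           # recovery
    return states

def restrict(states):
    """Coarse observables: fractions susceptible and infective."""
    return np.array([np.mean(states == 0), np.mean(states == 1)])

coarse = np.array([0.95, 0.05])
for outer in range(50):
    states = lift(*coarse)                       # lifting to a microscopic state
    burst = [restrict(states)]
    for _ in range(10):                          # short burst of micro-simulation
        burst.append(restrict(micro_step(states)))
    deriv = (burst[-1] - burst[0]) / (10 * dt)   # estimated coarse time derivative
    coarse = np.clip(coarse + 2.0 * deriv, 0.0, 1.0)   # projective (large) Euler step
    coarse /= max(1.0, coarse.sum())             # keep S + I <= 1
print("coarse state (S, I):", coarse)
```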

    Gene Expression and its Discontents: Developmental disorders as dysfunctions of epigenetic cognition

    Systems biology presently suffers the same mereological and sufficiency fallacies that haunt neural network models of high-order cognition. Shifting perspective from the massively parallel space of gene matrix interactions to the grammar/syntax of the time series of expressed phenotypes using a cognitive paradigm permits import of techniques from statistical physics via the homology between information source uncertainty and free energy density. This produces a broad spectrum of possible statistical models of development and its pathologies in which epigenetic regulation and the effects of embedding environment are analogous to a tunable enzyme catalyst. A cognitive paradigm naturally incorporates memory, leading directly to models of epigenetic inheritance, as affected by environmental exposures in the largest sense. Understanding gene expression, development, and their dysfunctions will require data analysis tools considerably more sophisticated than the present crop of simplistic models abducted from neural network studies or stochastic chemical reaction theory.
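
    The homology invoked here rests on a formal parallel between two large-system limits; the following is a standard statement of both, given as an illustration rather than as the paper's own derivation.

```latex
% Shannon-McMillan source uncertainty: N(n) counts the high-probability
% messages of length n emitted by an ergodic information source.
H = \lim_{n \to \infty} \frac{\log N(n)}{n}

% Free energy density: Z(V) is the partition function at volume V.
F = -kT \lim_{V \to \infty} \frac{\log Z(V)}{V}
```

    Both quantities are limits of the logarithm of a counting object per unit system size; it is this parallel that licenses the transfer of phase-transition machinery from statistical physics to information sources.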

    Dysfunctions of highly parallel real-time machines as 'developmental disorders': Security concerns and a Caveat Emptor

    A cognitive paradigm for gene expression in developmental biology that is based on rigorous application of the asymptotic limit theorems of information theory can be adapted to highly parallel real-time computing. The coming Brave New World of massively parallel 'autonomic' and 'Self-X' machines driven by the explosion of multiple core and molecular computing technologies will not be spared patterns of canonical and idiosyncratic failure analogous to the developmental disorders affecting organisms that have had the relentless benefit of a billion years of evolutionary pruning. This paper provides a warning both to potential users of these machines and, given that many such disorders can be induced by external agents, to those concerned with larger-scale matters of homeland security.

    Lurching Toward Chernobyl: Dysfunctions of Real-Time Computation

    Cognitive biological structures, social organizations, and computing machines operating in real time are subject to Rate Distortion Theorem constraints driven by the homology between information source uncertainty and free energy density. This exposes the unitary structure/environment system to a relentless entropic torrent compounded by sudden large deviations causing increased distortion between intent and impact, particularly as demands escalate. The phase transitions characteristic of information phenomena suggest that, rather than graceful decay under increasing load, these structures will undergo punctuated degradation akin to spontaneous symmetry breaking in physical systems. Rate distortion problems, which also affect internal structural dynamics, can become synergistic with limitations equivalent to the inattentional blindness of natural cognitive process. These mechanisms, and their interactions, are unlikely to scale well, so that, depending on architecture, enlarging the structure or its duties may lead to a crossover point at which added resources must be almost entirely devoted to ensuring system stability -- a form of allometric scaling familiar from biological examples. This suggests a critical need to tune architecture to problem type and system demand. A real-time computational structure and its environment are a unitary phenomenon, and environments are usually idiosyncratic. Thus the resulting path dependence in the development of pathology could often require an individualized approach to remediation more akin to an arduous psychiatric intervention than to the traditional engineering or medical quick fix. Failure to recognize the depth of these problems seems likely to produce a relentless chain of Chernobyl-like failures that are necessary, but often insufficient, for remediation under our system.
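
    The quantitative content behind "increased distortion as demands escalate" can be illustrated with the textbook rate-distortion function of a Gaussian source under squared-error distortion; this standard result is given only as an illustration, not as the paper's model.

```latex
% Gaussian source of variance \sigma^2, squared-error distortion:
R(D) = \max\left(0, \frac{1}{2}\log_2\frac{\sigma^2}{D}\right),
\qquad D(R) = \sigma^2\, 2^{-2R}
```

    As the available rate R shrinks relative to demand, the achievable distortion D grows exponentially: resources withdrawn from the channel between intent and impact are repaid in rapidly mounting error.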

    Institutional paraconsciousness and its pathologies

    This analysis extends a recent mathematical treatment of the Baars consciousness model to analogous, but far more complicated, phenomena of institutional cognition. Individual consciousness is limited to a single, tunable, giant component of interacting cognitive modules, instantiating a Global Workspace. Human institutions, by contrast, support several, sometimes many, such giant components simultaneously, although their behavior remains constrained to a topology generated by cultural context and by the path-dependence inherent to organizational history. Such highly parallel multitasking - institutional paraconsciousness - while clearly limiting inattentional blindness and the consequences of failures within individual workspaces, does not eliminate them, and introduces new characteristic dysfunctions involving the distortion of information sent between global workspaces. Consequently, organizations (or machines designed along these principles), while highly efficient at certain kinds of tasks, remain subject to canonical and idiosyncratic failure patterns similar to, but more complicated than, those afflicting individuals. Remediation is complicated by the manner in which pathogenic externalities can write images of themselves on both institutional function and therapeutic intervention, in the context of relentless market selection pressures. The approach is broadly consonant with recent work on collective efficacy, collective consciousness, and distributed cognition.
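
    The "single, tunable, giant component" language comes from random-graph theory; as an illustration only, in the Erdős-Rényi picture a network of cognitive modules with mean degree c has a giant connected component exactly when c > 1, with its fractional size S solving:

```latex
S = 1 - e^{-cS}
```

    Tuning the coupling c across the threshold c = 1 switches the giant component, and hence a global workspace, on or off; institutions, on this account, sustain several such components at once.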

    Information-Geometric Optimization Algorithms: A Unifying Picture via Invariance Principles

    We present a canonical way to turn any smooth parametric family of probability distributions on an arbitrary search space X into a continuous-time black-box optimization method on X, the information-geometric optimization (IGO) method. Invariance as a design principle minimizes the number of arbitrary choices. The resulting IGO flow conducts the natural gradient ascent of an adaptive, time-dependent, quantile-based transformation of the objective function. It makes no assumptions on the objective function to be optimized. The IGO method produces explicit IGO algorithms through time discretization. It naturally recovers versions of known algorithms and offers a systematic way to derive new ones. The cross-entropy method is recovered in a particular case, and can be extended into a smoothed, parametrization-independent maximum likelihood update (IGO-ML). For Gaussian distributions on R^d, IGO is related to natural evolution strategies (NES) and recovers a version of the CMA-ES algorithm. For Bernoulli distributions on {0,1}^d, we recover the PBIL algorithm. From restricted Boltzmann machines, we obtain a novel algorithm for optimization on {0,1}^d. All these algorithms are unified under a single information-geometric optimization framework. Thanks to its intrinsic formulation, the IGO method achieves invariance under reparametrization of the search space X, under a change of parameters of the probability distributions, and under increasing transformations of the objective function. Theory strongly suggests that IGO algorithms have minimal loss in diversity during optimization, provided the initial diversity is high. First experiments using restricted Boltzmann machines confirm this insight. Thus IGO seems to provide, from information theory, an elegant way to spontaneously explore several valleys of a fitness landscape in a single run.
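
    As a concrete instance, here is a minimal Python sketch of the Bernoulli case, where the IGO natural-gradient step reduces to a PBIL-style update of the component means; the objective (OneMax), population size, step size, and quantile weights are illustrative choices, not the paper's experimental setup.

```python
import numpy as np

# IGO for Bernoulli(theta) on {0,1}^d, which reduces to a PBIL-style rule.
# Objective, selection weights, and constants are illustrative.
rng = np.random.default_rng(1)
d, lam, dt = 20, 50, 0.1

def f(x):
    """Toy objective to maximize: OneMax (count of ones)."""
    return x.sum()

theta = np.full(d, 0.5)                  # Bernoulli means, one per bit
for t in range(200):
    X = (rng.random((lam, d)) < theta).astype(float)   # sample a population
    fx = np.array([f(x) for x in X])
    ranks = np.argsort(np.argsort(-fx))                # rank 0 = best sample
    w = (ranks < lam // 4) / (lam // 4)                # quantile weights: top 25%
    theta += dt * (w @ (X - theta))      # natural-gradient step in the mean parameters
    theta = np.clip(theta, 1 / d, 1 - 1 / d)           # stay off the boundary
print("learned means:", np.round(theta, 2))
```

    Since the weights sum to one, the update moves each mean toward the weighted average of the selected samples, theta <- (1 - dt) * theta + dt * mean_selected, which is the familiar PBIL rule.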

    Life as an Explanation of the Measurement Problem

    No consensus regarding the universal validity of any particular interpretation of the measurement problem has been reached so far. The problem manifests strongly in various Wigner's-friend-type experiments, where different observers experience different realities when measuring the same quantum system. But only classical information obeys the second law of thermodynamics and can be perceived solely at the holographic screen of the closed orientable two-dimensional manifold implied by Verlinde's and Landauer's mass-information equivalence equations. I conjecture that the biological cell, as a dissipative structure, is the smallest agent capable of processing quantum information through its holographic screen, and that this mechanism has been extended by natural evolution to endo- and exosemiosis in multicellular organisms, and further to the language of Homo sapiens. Any external stimuli must be measured and classified by the cell in the context of classical information to provide it with an evolutionary gain. Quantum information contained in a pure quantum state cannot be classified, while incoherent mixtures of non-orthogonal quantum states are only partially classifiable. The concept of an unobservable velocity, normal to the holographic screen, is introduced. It is shown to enable derivation of the Unruh acceleration as acting normal to the screen, as well as a convenient relation between the de Broglie and Compton wavelengths. It follows that the perceived universe is induced by the set of Pythagorean triples, while all its measurable features, including perceived dimensionality, are set to maximise informational diversity.
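
    The wavelength relation alluded to is presumably the standard one; it is reproduced below as an illustration only, together with the usual Unruh temperature for a uniformly accelerated observer, neither taken from the paper's own derivation.

```latex
% de Broglie and Compton wavelengths for a particle of mass m, speed v:
\lambda_{\mathrm{dB}} = \frac{h}{mv}, \qquad
\lambda_{\mathrm{C}} = \frac{h}{mc}
\quad\Longrightarrow\quad
\frac{\lambda_{\mathrm{C}}}{\lambda_{\mathrm{dB}}} = \frac{v}{c}

% Unruh temperature seen by an observer with proper acceleration a:
T = \frac{\hbar a}{2\pi c k_B}
```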

    Meta-heuristic algorithms in car engine design: a literature survey

    Meta-heuristic algorithms are often inspired by natural phenomena, including the evolution of species in Darwinian natural selection theory, ant behaviors in biology, flocking behaviors of some birds, and annealing in metallurgy. Due to their great potential in solving difficult optimization problems, meta-heuristic algorithms have found their way into automobile engine design. Different optimization problems arise in different areas of engine management, including calibration, control, fault diagnosis, and modeling. In this paper we review the state-of-the-art applications of meta-heuristic algorithms in engine management systems. The review covers a wide range of research, including the application of meta-heuristic algorithms to engine calibration, engine control systems, engine fault diagnosis, and the optimization and modeling of engine components. The meta-heuristic algorithms reviewed in this paper include evolutionary algorithms, evolution strategies, evolutionary programming, genetic programming, differential evolution, estimation of distribution algorithms, ant colony optimization, particle swarm optimization, memetic algorithms, and artificial immune systems.
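
    As a flavor of the simplest of these methods, here is a minimal particle swarm optimization sketch on a hypothetical two-parameter calibration surface; the objective, the parameter names (spark advance, air-fuel ratio), and all constants are invented for illustration and are not drawn from any model in the survey.

```python
import numpy as np

# Particle swarm optimization sketch on a toy 2-parameter "calibration"
# objective. Everything here is illustrative, not from the survey.
rng = np.random.default_rng(2)

def cost(p):
    """Hypothetical calibration error surface over (spark advance, AFR)."""
    spark, afr = p[..., 0], p[..., 1]
    return (spark - 12.0) ** 2 + 2.0 * (afr - 14.7) ** 2

n, iters, w, c1, c2 = 30, 100, 0.7, 1.5, 1.5
pos = rng.uniform([0, 10], [40, 20], size=(n, 2))   # particle positions
vel = np.zeros_like(pos)
pbest = pos.copy()                                  # per-particle best positions
pbest_cost = cost(pbest)
gbest = pbest[np.argmin(pbest_cost)]                # swarm-wide best position

for _ in range(iters):
    r1, r2 = rng.random((2, n, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    c = cost(pos)
    improved = c < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], c[improved]
    gbest = pbest[np.argmin(pbest_cost)]

print("best parameters (spark advance, AFR):", np.round(gbest, 2))
```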
