
    Design of the Artificial: lessons from the biological roots of general intelligence

    Our desire for and fascination with intelligent machines dates back to antiquity's mythical automaton Talos, Aristotle's mode of mechanical thought (the syllogism) and Heron of Alexandria's mechanical machines and automata. However, the quest for Artificial General Intelligence (AGI) has been troubled by repeated failures of strategies and approaches throughout its history. This decade has seen a shift in interest towards bio-inspired software and hardware, on the assumption that such mimicry entails intelligence. Though these steps have been fruitful in certain directions and have advanced automation, their singular design focus renders them highly inefficient in achieving AGI. Which set of requirements has to be met in the design of AGI? What are the limits in the design of the artificial? Here, a careful examination of computation in biological systems hints that evolutionary tinkering with contextual processing of information, enabled by a hierarchical architecture, is the key to building AGI.
    Comment: Theoretical perspective on AGI (Artificial General Intelligence)

    On the origin of ambiguity in efficient communication

    This article studies the emergence of ambiguity in communication through the concept of logical irreversibility and within the framework of Shannon's information theory. This leads us to a precise and general expression of the intuition behind Zipf's vocabulary balance in terms of a symmetry equation between the complexities of the coding and the decoding processes that imposes an unavoidable amount of logical uncertainty in natural communication. Accordingly, the emergence of irreversible computations is required if the complexities of the coding and the decoding processes are balanced in a symmetric scenario, which means that the emergence of ambiguous codes is a necessary condition for natural communication to succeed.
    Comment: 28 pages, 2 figures
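    As a hedged sketch of the quantities at play (writing M for meanings and S for signals, notation ours rather than the paper's): the mutual information between signals and meanings decomposes symmetrically into a decoding-side uncertainty H(M|S), which measures ambiguity, and a coding-side uncertainty H(S|M); a logically reversible (bijective) code is exactly the case where both vanish.

        % Shannon identities for a code between meanings M and signals S
        % (a reconstruction of the setting, not the paper's exact equation).
        \begin{align*}
          I(S;M) &= H(M) - H(M \mid S) = H(S) - H(S \mid M), \\
          \text{reversible code:} \quad & H(M \mid S) = H(S \mid M) = 0, \\
          \text{ambiguous code:}  \quad & H(M \mid S) > 0.
        \end{align*}

    On this reading, if the symmetry equation balances the two conditional entropies and the balance cannot hold at zero, then H(M|S) > 0 follows, i.e. ambiguity is unavoidable; this sketches the intuition rather than the paper's exact derivation.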

    Complexity, parallel computation and statistical physics

    The intuition that a long history is required for the emergence of complexity in natural systems is formalized using the notion of depth. The depth of a system is defined in terms of the number of parallel computational steps needed to simulate it. Depth provides an objective, irreducible measure of history applicable to systems of the kind studied in statistical physics. It is argued that physical complexity cannot occur in the absence of substantial depth, and that depth is a useful proxy for physical complexity. The ideas are illustrated for a variety of systems in statistical physics.
    Comment: 21 pages, 7 figures
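    A toy illustration of depth as parallel time (a hedged sketch; the code and numbers are ours, not the paper's): summing n numbers takes n - 1 additions of total work, but only about log2(n) rounds of pairwise additions when each round runs in parallel, so the sum has small depth no matter how large n is.

        # Toy sketch of depth as the number of parallel steps: reduce a list
        # by pairwise sums, where every pair within a round can be added
        # simultaneously, so each round counts as one parallel step.
        import math

        def parallel_sum_depth(values):
            """Return (total, number of parallel rounds) of a pairwise reduction."""
            rounds = 0
            while len(values) > 1:
                values = [sum(values[i:i + 2]) for i in range(0, len(values), 2)]
                rounds += 1
            return values[0], rounds

        total, depth = parallel_sum_depth(list(range(16)))
        print(total, depth)  # 120 4, matching ceil(log2(16)) == 4
        assert depth == math.ceil(math.log2(16))

    A system is deep, in this sense, when no such parallel shortcut exists and the steps of its history must be simulated essentially one after another.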

    What Makes a Computation Unconventional?

    A coherent mathematical overview of computation and its generalisations is described. This conceptual framework is sufficient to comfortably host a wide range of contemporary thinking on embodied computation and its models.
    Comment: Based on an invited lecture for the 'Symposium on Natural/Unconventional Computing and Its Philosophical Significance' at the AISB/IACAP World Congress 2012, University of Birmingham, July 2-6, 2012