
    Algorithmic Complexity for Short Binary Strings Applied to Psychology: A Primer

    Since human randomness production has been studied and widely used to assess executive functions (especially inhibition), many measures have been proposed to quantify the degree to which a sequence is random-like. However, each of them focuses on a single feature of randomness, forcing authors to combine multiple measures. Here we describe and advocate the use of the accepted universal measure of randomness based on algorithmic complexity, estimated by means of a previously presented technique that uses the definition of algorithmic probability. As a case study, we provide a re-analysis of the classical Radio Zenith data in the light of the proposed measure and methodology. Comment: To appear in Behavior Research Methods
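    The coding-theorem approach behind this measure estimates the algorithmic probability m(s) of a short string s by counting how often s is output by a large space of small programs, then sets K(s) ≈ -log2 m(s). Below is a minimal Python sketch of that idea, substituting elementary cellular automata for the small Turing machines enumerated in the actual method; all parameter choices (width, steps) are illustrative, not the paper's:

    ```python
    from math import log2
    from collections import Counter

    def eca_step(row, rule):
        """One step of an elementary cellular automaton (circular boundary)."""
        n = len(row)
        return tuple(
            (rule >> ((row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n])) & 1
            for i in range(n)
        )

    def toy_algorithmic_probability(width=5, steps=5):
        """Estimate m(s): run every ECA rule on every seed and count how
        often each final row appears among all rule/seed pairs."""
        counts, total = Counter(), 0
        for rule in range(256):
            for seed in range(2 ** width):
                row = tuple((seed >> i) & 1 for i in range(width))
                for _ in range(steps):
                    row = eca_step(row, rule)
                counts[row] += 1
                total += 1
        return {s: c / total for s, c in counts.items()}

    m = toy_algorithmic_probability()

    def K(s):
        """Coding-theorem estimate of algorithmic complexity: K(s) = -log2 m(s)."""
        return -log2(m[s])

    # Strings produced by many rule/seed pairs get low K (simple);
    # rarely produced strings get high K (random-like).
    ```

    The key property the measure exploits is that simple strings (e.g. all zeros) are generated by many different programs and so receive high probability and low complexity, with no need to pick one feature of randomness in advance.
    
    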

    Complexity of multi-dimensional spontaneous EEG decreases during propofol induced general anaesthesia

    Emerging neural theories of consciousness suggest a correlation between a specific type of neural dynamical complexity and the level of consciousness: When awake and aware, causal interactions between brain regions are both integrated (all regions are to a certain extent connected) and differentiated (there is inhomogeneity and variety in the interactions). In support of this, recent work by Casali et al. (2013) has shown that Lempel-Ziv complexity correlates strongly with conscious level when computed on the EEG response to transcranial magnetic stimulation. Here we investigated complexity of spontaneous high-density EEG data during propofol-induced general anaesthesia. We consider three distinct measures: (i) Lempel-Ziv complexity, which is derived from how compressible the data are; (ii) amplitude coalition entropy, which measures the variability in the constitution of the set of active channels; and (iii) the novel synchrony coalition entropy (SCE), which measures the variability in the constitution of the set of synchronous channels. After some simulations on Kuramoto oscillator models which demonstrate that these measures capture distinct ‘flavours’ of complexity, we show that there is a robustly measurable decrease in the complexity of spontaneous EEG during general anaesthesia.
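    The Lempel-Ziv complexity referred to here is typically the LZ76 phrase count computed on a binarized signal: the sequence is parsed left to right into the shortest phrases not already reproducible from the preceding text, and the number of phrases (usually normalized) is the complexity. A minimal Python sketch of the LZ76 parse on a binary string follows; it illustrates the measure itself, not the authors' EEG preprocessing pipeline:

    ```python
    def lz76_complexity(s: str) -> int:
        """LZ76 complexity: number of phrases in the exhaustive-history
        parsing of s. Each phrase is extended while it still occurs as a
        substring of the text preceding its last character."""
        n, i, count = len(s), 0, 0
        while i < n:
            length = 1
            while i + length <= n and s[i:i + length] in s[:i + length - 1]:
                length += 1
            count += 1
            i += length
        return count

    # A constant string parses into very few phrases; an irregular one needs more.
    print(lz76_complexity("0000000000000000"))  # → 2
    print(lz76_complexity("0001101001000101"))  # → 6
    ```

    In practice the raw count grows with sequence length, so applications of this kind normalize it, e.g. against the complexity of a randomly shuffled surrogate of the same data, so that values are comparable across recordings.
    
    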

    From Knowledge, Knowability and the Search for Objective Randomness to a New Vision of Complexity

    Herein we consider various concepts of entropy as measures of the complexity of phenomena and in so doing encounter a fundamental problem in physics that affects how we understand the nature of reality. In essence the difficulty has to do with our understanding of randomness, irreversibility and unpredictability using physical theory, and these in turn undermine our certainty regarding what we can and what we cannot know about complex phenomena in general. The sources of complexity examined herein appear to be channels for the amplification of naturally occurring randomness in the physical world. Our analysis suggests that when the conditions for the renormalization group apply, this spontaneous randomness, which is not a reflection of our limited knowledge, but a genuine property of nature, does not realize the conventional thermodynamic state, and a new condition, intermediate between the dynamic and the thermodynamic state, emerges. We argue that with this vision of complexity, life, which with ordinary statistical mechanics seems to be foreign to physics, becomes a natural consequence of dynamical processes. Comment: Phylosophica

    Towards a common thread in Complexity: an accuracy-based approach

    The complexity of a system, in general, makes it difficult to determine some or almost all matrix elements of its operators. The lack of accuracy acts as a source of randomness for the matrix elements, which are also subjected to an external potential due to existing system conditions. The fluctuation of accuracy due to varying system conditions leads to a diffusion of the matrix elements. We show that, for single-well potentials, the diffusion can be described by a common mathematical formulation in which system information enters through a single parameter. This further leads to a characterization of physical properties by an infinite range of single-parameter universality classes.

    Complexity, BioComplexity, the Connectionist Conjecture and Ontology of Complexity

    This paper develops and integrates major ideas and concepts on complexity and biocomplexity: the Connectionist Conjecture, a universal ontology of complexity, the irreducible complexity of totality and inherent randomness, the perpetual evolution of information, the emergence of criticality, and the equivalence of symmetry and complexity. The paper introduces the Connectionist Conjecture, which states that the one and only representation of Totality is the connectionist one, i.e. in terms of nodes and edges. It also introduces the idea of a Universal Ontology of Complexity and develops concepts in that direction, along with ideas on the perpetual evolution of information and on the irreducibility and computability of totality, all in the context of the Connectionist Conjecture. The paper indicates that control and communication are the prime functionals responsible for the symmetry and complexity of complex phenomena. It takes the stand that the phenomenon of life (including its evolution) is probably the nearest to what we can describe with the term “complexity”, and assumes that signaling and communication within the living world, and of the living world with the environment, create the connectionist structure of biocomplexity. With life and its evolution as the substrate, the paper develops ideas towards the ontology of complexity, introduces new complexity-theoretic interpretations of fundamental biomolecular parameters, and develops ideas on a methodology for determining the complexity of “true” complex phenomena.

    Algorithmic Randomness as Foundation of Inductive Reasoning and Artificial Intelligence

    This article is a brief personal account of the past, present, and future of algorithmic randomness, emphasizing its role in inductive inference and artificial intelligence. It is written for a general audience interested in science and philosophy. Intuitively, randomness is a lack of order or predictability. If randomness is the opposite of determinism, then algorithmic randomness is the opposite of computability. Besides many other things, these concepts have been used to quantify Ockham's razor, solve the induction problem, and define intelligence. Comment: 9 LaTeX pages