43,992 research outputs found

    Complexity Through Nonextensivity

    The problem of defining and studying the complexity of a time series has interested people for years. In the context of dynamical systems, Grassberger suggested that a slow approach of the entropy to its extensive asymptotic limit is a sign of complexity. We investigate this idea further using information-theoretic and statistical-mechanics techniques and show that these arguments can be made precise and that they generalize many previous approaches to complexity, in particular unifying ideas from the physics literature with ideas from learning and coding theory; there are even connections between this statistical approach and algorithmic or Kolmogorov complexity. Moreover, a set of simple axioms similar to those used by Shannon in his development of information theory allows us to prove that the divergent part of the subextensive component of the entropy is a unique complexity measure. We classify time series by their complexities and demonstrate that, beyond the 'logarithmic' complexity classes widely anticipated in the literature, there are qualitatively more complex, 'power-law' classes which deserve more attention.
    Comment: summarizes and extends physics/000707
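
    The decomposition the abstract alludes to can be illustrated numerically. Below is a minimal sketch (not the authors' code) that estimates the block entropy S(N) of a binary Markov chain and splits off a subextensive part S1(N) = S(N) - s0*N, where s0 is the entropy rate; for this simple source S1(N) saturates at a constant, whereas the 'logarithmic' and 'power-law' classes discussed in the paper would have S1(N) growing with N.

# A minimal sketch: estimate the block entropy S(N) of a binary Markov chain
# and split off the subextensive part S1(N) = S(N) - s0*N, which carries the
# proposed complexity measure.  For this simple source S1(N) saturates.
import math
import random
from collections import Counter

random.seed(0)

# Generate a binary Markov chain with switching probability p.
p, T = 0.1, 200_000
seq, state = [], 0
for _ in range(T):
    if random.random() < p:
        state = 1 - state
    seq.append(state)

def block_entropy(seq, N):
    """Empirical Shannon entropy (in bits) of length-N blocks."""
    counts = Counter(tuple(seq[i:i + N]) for i in range(len(seq) - N + 1))
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# Entropy rate of the chain, known analytically for this example.
s0 = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

for N in range(1, 9):
    S = block_entropy(seq, N)
    print(f"N={N}  S(N)={S:.3f}  subextensive part S1(N)={S - s0 * N:.3f}")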

    Measuring sets in infinite groups

    We are now witnessing a rapid growth of a new part of group theory which has become known as "statistical group theory". A typical result in this area would say something like "a random element (or a tuple of elements) of a group G has a property P with probability p". The validity of a statement like that does, of course, heavily depend on how one defines probability on groups or, equivalently, how one measures sets in a group (in particular, in a free group). We hope that the new approaches to defining probabilities on groups outlined in this paper create, among other things, an appropriate framework for the study of the "average case" complexity of algorithms on groups.
    Comment: 22 pages
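
    As a concrete illustration of the kind of measurement the abstract describes, the sketch below estimates the asymptotic density of a set in the free group F_2 by sampling uniform random reduced words of length n; the property tested (being cyclically reduced) and the sampling scheme are choices made here for illustration, not taken from the paper.

# A minimal sketch: one common way to "measure" a set in the free group
# F_2 = <a, b> is by its asymptotic density among reduced words of length n.
# Here we sample uniform random reduced words and estimate the density of
# the set of cyclically reduced words.
import random

random.seed(0)
LETTERS = ["a", "A", "b", "B"]          # A = a^-1, B = b^-1
INVERSE = {"a": "A", "A": "a", "b": "B", "B": "b"}

def random_reduced_word(n):
    """Uniform random reduced word of length n in F_2."""
    word = [random.choice(LETTERS)]
    while len(word) < n:
        x = random.choice(LETTERS)
        if x != INVERSE[word[-1]]:      # keep the word freely reduced
            word.append(x)
    return word

def is_cyclically_reduced(word):
    return word[0] != INVERSE[word[-1]]

trials = 20_000
for n in (5, 10, 20, 40):
    hits = sum(is_cyclically_reduced(random_reduced_word(n)) for _ in range(trials))
    print(f"n={n:3d}  estimated density of cyclically reduced words: {hits / trials:.3f}")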

    A Short Introduction to Model Selection, Kolmogorov Complexity and Minimum Description Length (MDL)

    The concept of overfitting in model selection is explained and demonstrated with an example. After giving some background on information theory and Kolmogorov complexity, we provide a short explanation of Minimum Description Length and error minimization. We conclude with a discussion of the typical features of overfitting in model selection.
    Comment: 20 pages, Chapter 1 of The Paradox of Overfitting, Master's thesis, Rijksuniversiteit Groningen, 200
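
    The overfitting-versus-MDL trade-off the abstract refers to can be demonstrated with a toy experiment. The sketch below fits polynomials of increasing degree to noisy data and compares the training error, which keeps shrinking as the model grows, with a crude two-part description-length score (bits for quantized residuals plus bits for coefficients); this particular coding scheme is an assumption made here for illustration, not the criterion used in the thesis.

# A minimal sketch of overfitting versus a two-part MDL-style score:
# total description length = bits to encode residuals quantized to precision
# delta (under a Gaussian model) + bits to encode the polynomial coefficients.
import math
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 30)
y = np.sin(3 * x) + rng.normal(0, 0.1, size=x.size)   # noisy observations of a smooth signal

def description_length(degree, bits_per_param=8, delta=0.01):
    """Training MSE and a crude two-part description length for a polynomial fit."""
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    sigma2 = max(residuals.var(), 1e-12)
    data_bits = x.size * (0.5 * math.log2(2 * math.pi * math.e * sigma2) - math.log2(delta))
    model_bits = bits_per_param * (degree + 1)
    return residuals.var(), data_bits + model_bits

for d in (1, 3, 5, 8, 12):
    mse, dl = description_length(d)
    print(f"degree={d:2d}  training MSE={mse:.4f}  description length={dl:7.1f} bits")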

    On the Numerical Study of the Complexity and Fractal Dimension of CMB Anisotropies

    We consider the problem of numerically computing the Kolmogorov complexity and the fractal dimension of the anisotropy spots of Cosmic Microwave Background (CMB) radiation. Namely, we describe an algorithm for estimating the complexity of spots given by a certain pixel configuration on a grid and present the results of computations for a series of structures of different complexity. We thus demonstrate the calculability of such an abstract descriptor as the Kolmogorov complexity for digitized CMB maps. The correlation of the complexity of the anisotropy spots with their fractal dimension is revealed as well. This technique can be especially important when analyzing the data of forthcoming space experiments.
    Comment: LaTeX, 3 figures
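
    Since Kolmogorov complexity is uncomputable, any numerical study of it works with an upper bound. The sketch below uses a lossless-compression proxy for the complexity of a binary pixel map together with a box-counting estimate of the fractal dimension of a spot's boundary; this is a generic stand-in for the kind of computation described, not the authors' algorithm.

# A minimal sketch: compression-based upper bound on the complexity of a
# pixel map, plus a box-counting estimate of the fractal dimension of the
# spot boundary, compared for a plain square spot and for pure noise.
import zlib
import numpy as np

rng = np.random.default_rng(0)
n = 128
simple = np.zeros((n, n), dtype=np.uint8)
simple[n // 4: 3 * n // 4, n // 4: 3 * n // 4] = 1        # a plain square spot
noisy = (rng.random((n, n)) < 0.5).astype(np.uint8)       # incompressible noise

def complexity_proxy(grid):
    """Compressed size in bytes: an upper-bound stand-in for Kolmogorov complexity."""
    return len(zlib.compress(np.packbits(grid).tobytes(), 9))

def boundary(grid):
    """Pixels of the spot that touch the background (a crude edge map)."""
    g = grid.astype(bool)
    interior = g & np.roll(g, 1, 0) & np.roll(g, -1, 0) & np.roll(g, 1, 1) & np.roll(g, -1, 1)
    return (g & ~interior).astype(np.uint8)

def box_counting_dimension(grid):
    """Slope of log N(s) versus log(1/s) over dyadic box sizes s."""
    sizes, counts = [], []
    s = grid.shape[0] // 2
    while s >= 1:
        boxes = grid.reshape(grid.shape[0] // s, s, grid.shape[1] // s, s)
        sizes.append(s)
        counts.append(max(int(boxes.any(axis=(1, 3)).sum()), 1))
        s //= 2
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

for name, grid in (("square spot", simple), ("random noise", noisy)):
    k = complexity_proxy(grid)
    d = box_counting_dimension(boundary(grid))
    print(f"{name:12s}  complexity proxy = {k:5d} bytes   boundary box dimension = {d:.2f}")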

    The Thermodynamics of Network Coding, and an Algorithmic Refinement of the Principle of Maximum Entropy

    The principle of maximum entropy (Maxent) is often used to obtain prior probability distributions as a method to obtain a Gibbs measure under some restriction, giving the probability that a system will be in a certain state compared to the rest of the elements in the distribution. Because classical entropy-based Maxent collapses cases, confounding all distinct degrees of randomness and pseudo-randomness, here we take into consideration the generative mechanism of the systems in the ensemble: we separate objects that may comply with the principle under some restriction, and whose entropy is maximal but which may be generated recursively, from those that are actually algorithmically random, offering a refinement of classical Maxent. We take advantage of a causal algorithmic calculus to derive a thermodynamic-like result based on how difficult it is to reprogram a computer code. Using the distinction between computable and algorithmic randomness, we quantify the cost in information loss associated with reprogramming. To illustrate this, we apply the algorithmic refinement of Maxent to graphs and introduce a Maximal Algorithmic Randomness Preferential Attachment (MARPA) algorithm, a generalisation of previous approaches. We discuss practical implications of evaluating network randomness. Our analysis provides the insight that the reprogrammability asymmetry appears to originate from a non-monotonic relationship to algorithmic probability, and it motivates further study of the origin and consequences of these asymmetries, of reprogrammability, and of computation.
    Comment: 30 pages
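
    The sketch below illustrates the flavour of a maximal-algorithmic-randomness preferential-attachment process: a graph grows by attaching each new node wherever the attachment makes the graph least compressible. Note that zlib compression is substituted here as a crude stand-in for the algorithmic-probability estimates the paper relies on, so this is only an illustration of the idea, not the MARPA algorithm itself.

# A rough sketch: grow a graph by attaching each new node to whichever
# existing node maximizes a randomness proxy (here, the compressed size of
# the adjacency matrix).  Compression replaces the algorithmic-probability
# estimates used in the paper, purely for illustration.
import zlib
import numpy as np

def compressed_size(adj):
    """Randomness proxy: compressed size of the adjacency matrix (an upper bound)."""
    return len(zlib.compress(np.packbits(adj).tobytes(), 9))

def grow_graph(n_nodes):
    adj = np.zeros((n_nodes, n_nodes), dtype=np.uint8)
    adj[0, 1] = adj[1, 0] = 1                        # start from a single edge
    for new in range(2, n_nodes):
        best_target, best_score = 0, -1
        for target in range(new):                    # try every possible attachment
            adj[new, target] = adj[target, new] = 1
            score = compressed_size(adj[:new + 1, :new + 1])
            adj[new, target] = adj[target, new] = 0
            if score > best_score:
                best_target, best_score = target, score
        adj[new, best_target] = adj[best_target, new] = 1
    return adj

adj = grow_graph(30)
degrees = sorted((int(d) for d in adj.sum(axis=1)), reverse=True)
print("degree sequence of the grown graph:", degrees)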

    From Knowledge, Knowability and the Search for Objective Randomness to a New Vision of Complexity

    Herein we consider various concepts of entropy as measures of the complexity of phenomena, and in so doing encounter a fundamental problem in physics that affects how we understand the nature of reality. In essence, the difficulty has to do with our understanding of randomness, irreversibility and unpredictability using physical theory, and these in turn undermine our certainty regarding what we can and what we cannot know about complex phenomena in general. The sources of complexity examined herein appear to be channels for the amplification of naturally occurring randomness in the physical world. Our analysis suggests that when the conditions for the renormalization group apply, this spontaneous randomness, which is not a reflection of our limited knowledge but a genuine property of nature, does not realize the conventional thermodynamic state, and a new condition, intermediate between the dynamic and the thermodynamic state, emerges. We argue that with this vision of complexity, life, which under ordinary statistical mechanics seems to be foreign to physics, becomes a natural consequence of dynamical processes.
    Comment: Phylosophica