
    Aspects of Statistical Physics in Computational Complexity

    The aim of this review paper is to give a panoramic view of the impact of spin glass theory and statistical physics on the study of the K-sat problem. The introduction of spin glass theory into the study of the random K-sat problem has indeed left a mark on the field, leading to groundbreaking descriptions of the geometry of its solution space and helping to shed light on why the problem seems so hard to solve. Most of the geometrical intuitions have their roots in the Sherrington-Kirkpatrick model of spin glasses. We'll start Chapter 2 by introducing the model from a mathematical perspective, presenting a selection of rigorous results and giving a first intuition about the cavity method. We'll then switch to a physical perspective to explore concepts like pure states, hierarchical clustering and replica symmetry breaking. Chapter 3 will be devoted to the spin glass formulation of K-sat, while the most important phase transitions of K-sat (clustering, condensation, freezing and SAT/UNSAT) will be discussed extensively in Chapter 4, with respect to their complexity, free-entropy density and the Parisi 1RSB parameter. The concept of an algorithmic barrier will be presented in Chapter 5 and exemplified in detail on the Belief Propagation (BP) algorithm. The BP algorithm will be introduced and motivated, and a numerical analysis of a BP-guided decimation algorithm will be used to show the role of the clustering, condensation and freezing phase transitions in creating an algorithmic barrier for BP. Building on the failure of BP in the clustered and condensed phases, Chapter 6 will finally introduce the cavity method to deal with the shattering of the solution space, and present its application to the development of the Survey Propagation algorithm. Comment: 56 pages, 14 figures
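    As a concrete companion to the abstract, the plain BP update equations it refers to can be sketched on a small formula. This is a minimal illustrative sketch, not the paper's algorithm: the toy 3-clause instance and all names are assumptions, and the BP-guided decimation discussed in the abstract adds a fixing-and-simplifying step on top of these marginals.

```python
import itertools

# Hypothetical toy instance: a clause is a tuple of signed ints, where
# 3 means "x3 = True satisfies the clause" and -3 means "x3 = False does".
clauses = [(1, 2, 3), (-1, 2, -3), (1, -2, 3), (-1, -2, -3)]
variables = sorted({abs(l) for c in clauses for l in c})

def normalise(m):
    s = m[0] + m[1]
    return [m[0] / s, m[1] / s]

# Messages are distributions over {False, True}, initialised uniformly.
v2c = {(v, c): [0.5, 0.5] for c in clauses for v in {abs(l) for l in c}}
c2v = {(c, v): [0.5, 0.5] for c in clauses for v in {abs(l) for l in c}}

for _ in range(100):                     # iterate the BP fixed-point equations
    for (v, c) in v2c:                   # variable -> clause messages
        m = [1.0, 1.0]
        for c2 in clauses:               # product over the *other* clauses on v
            if c2 != c and v in {abs(l) for l in c2}:
                m = [m[0] * c2v[(c2, v)][0], m[1] * c2v[(c2, v)][1]]
        v2c[(v, c)] = normalise(m)
    for (c, v) in c2v:                   # clause -> variable messages
        lit = next(l for l in c if abs(l) == v)
        others = [l for l in c if abs(l) != v]
        m = [0.0, 0.0]
        for xv in (0, 1):                # sum over the other variables' values
            for assign in itertools.product((0, 1), repeat=len(others)):
                sat = ((xv == 1) == (lit > 0)) or any(
                    (a == 1) == (l > 0) for a, l in zip(assign, others))
                if sat:                  # clause factor is 1 iff satisfied
                    w = 1.0
                    for a, l in zip(assign, others):
                        w *= v2c[(abs(l), c)][a]
                    m[xv] += w
        c2v[(c, v)] = normalise(m)

# BP marginals: product of all incoming clause messages at each variable.
marginals = {}
for v in variables:
    m = [1.0, 1.0]
    for c in clauses:
        if v in {abs(l) for l in c}:
            m = [m[0] * c2v[(c, v)][0], m[1] * c2v[(c, v)][1]]
    marginals[v] = normalise(m)
print(marginals)
```

    A decimation heuristic would now fix the most polarised variable according to `marginals`, simplify the formula, and rerun BP; the abstract's point is that in the clustered and condensed phases these marginals become unreliable and the scheme breaks down.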

    What Is a Macrostate? Subjective Observations and Objective Dynamics

    We consider the question of whether thermodynamic macrostates are objective consequences of dynamics, or subjective reflections of our ignorance of a physical system. We argue that they are both; more specifically, that the set of macrostates forms the unique maximal partition of phase space which 1) is consistent with our observations (a subjective fact about our ability to observe the system) and 2) obeys a Markov process (an objective fact about the system's dynamics). We review the ideas of computational mechanics, an information-theoretic method for finding optimal causal models of stochastic processes, and argue that macrostates coincide with the ``causal states'' of computational mechanics. Defining a set of macrostates thus consists of an inductive process where we start with a given set of observables, and then refine our partition of phase space until we reach a set of states which predict their own future, i.e. which are Markovian. Macrostates arrived at in this way are provably optimal statistical predictors of the future values of our observables. Comment: 15 pages, no figures
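    The refinement procedure described in the abstract can be illustrated on data: group observed histories into classes with the same predictive distribution, so that each class "predicts its own future". The sketch below is an assumption-laden toy, not the paper's construction: it uses the well-known golden mean process (no two consecutive 1s) and a crude rounding step to cluster the empirical predictive distributions into causal-state-like classes.

```python
import random
from collections import defaultdict

random.seed(0)

# Sample the golden mean process: after a 1 the next symbol must be 0;
# after a 0 the next symbol is 0 or 1 with equal probability.
seq = [0]
for _ in range(100_000):
    seq.append(0 if seq[-1] == 1 else random.randint(0, 1))

# Estimate P(next = 1 | last k symbols) for every observed length-k history.
k = 3
counts = defaultdict(lambda: [0, 0])
for i in range(k, len(seq)):
    counts[tuple(seq[i - k:i])][seq[i]] += 1

# Merge histories whose predictive distributions agree (coarse rounding
# stands in for a statistical test): the classes are the empirical states.
states = {}
for h, (n0, n1) in counts.items():
    p1 = n1 / (n0 + n1)
    states.setdefault(round(p1, 1), []).append(h)

for p1, hs in sorted(states.items()):
    print(p1, sorted(hs))
```

    For this process the refinement stops at two classes, "last symbol was 1" (next symbol is certainly 0) and "last symbol was 0" (next symbol is fair), matching the idea that the causal states are the coarsest partition that is still Markovian.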

    Complexity, parallel computation and statistical physics

    The intuition that a long history is required for the emergence of complexity in natural systems is formalized using the notion of depth. The depth of a system is defined in terms of the number of parallel computational steps needed to simulate it. Depth provides an objective, irreducible measure of history applicable to systems of the kind studied in statistical physics. It is argued that physical complexity cannot occur in the absence of substantial depth and that depth is a useful proxy for physical complexity. The ideas are illustrated for a variety of systems in statistical physics. Comment: 21 pages, 7 figures
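    The notion of "parallel computational steps" can be made concrete with a standard textbook example (an illustration of the general idea, not taken from the paper): summing n numbers takes n - 1 sequential steps, but only about log2(n) rounds if all adjacent pairs are combined simultaneously, so its depth is logarithmic.

```python
def parallel_depth_sum(xs):
    """Sum xs by pairwise reduction, counting parallel rounds (the depth)."""
    rounds = 0
    while len(xs) > 1:
        # one parallel round: all adjacent pairs are combined at once
        xs = [xs[i] + xs[i + 1] for i in range(0, len(xs) - 1, 2)] + \
             ([xs[-1]] if len(xs) % 2 else [])
        rounds += 1
    return xs[0], rounds

total, depth = parallel_depth_sum(list(range(16)))
print(total, depth)  # -> 120 4: sixteen inputs need only four parallel rounds
```

    A "deep" system in the paper's sense is one whose simulation cannot be collapsed this way: no amount of parallel hardware shortcuts its history.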

    Some Computational Aspects of Essential Properties of Evolution and Life

    While evolution has inspired algorithmic methods of heuristic optimisation, little has been done in the way of using concepts of computation to advance our understanding of salient aspects of biological evolution. We argue that under reasonable assumptions, interesting conclusions can be drawn that are of relevance to behavioural evolution. We will focus on two important features of life--robustness and fitness optimisation--which, we will argue, are related to algorithmic probability and to the thermodynamics of computation, subjects that may be capable of explaining and modelling key features of living organisms, and which can be used in understanding and formulating algorithms of evolutionary computation.

    Information Causality, the Tsirelson Bound, and the 'Being-Thus' of Things

    The principle of `information causality' can be used to derive an upper bound---known as the `Tsirelson bound'---on the strength of quantum mechanical correlations, and has been conjectured to be a foundational principle of nature. To date, however, it has not been sufficiently motivated to play such a foundational role. The motivations that have so far been given are, as I argue, either unsatisfactorily vague or appeal to little if anything more than intuition. Thus in this paper I consider whether some way might be found to successfully motivate the principle. And I propose that a compelling way of doing so is to understand it as a generalisation of Einstein's principle of the mutually independent existence---the `being-thus'---of spatially distant things. In particular I first describe an argument, due to Demopoulos, to the effect that the so-called `no-signalling' condition can be viewed as a generalisation of Einstein's principle that is appropriate for an irreducibly statistical theory such as quantum mechanics. I then argue that a compelling way to motivate information causality is to in turn consider it as a further generalisation of the Einsteinian principle that is appropriate for a theory of communication. I describe, however, some important conceptual obstacles that must yet be overcome if the project of establishing information causality as a foundational principle of nature is to succeed. Comment: '*' footnote added to page 1; 24 pages, 1 figure; Forthcoming in Studies in History and Philosophy of Modern Physics

    Counting Steps: A Finitist Approach to Objective Probability in Physics

    We propose a new interpretation of objective probability in statistical physics based on physical computational complexity. This notion applies to a single physical system (be it an experimental set-up in the lab, or a subsystem of the universe), and quantifies (1) the difficulty of realizing a physical state given another, (2) the 'distance' (in terms of physical resources) between a physical state and another, and (3) the size of the set of time-complexity functions that are compatible with the physical resources required to reach a physical state from another. This view (a) exorcises 'ignorance' from statistical physics, and (b) underlies a new interpretation of non-relativistic quantum mechanics.

    Complex Networks from Classical to Quantum

    Recent progress in applying complex network theory to problems in quantum information has resulted in a beneficial crossover. Complex network methods have successfully been applied to transport and entanglement models, while information physics is setting the stage for a theory of complex systems with quantum information-inspired methods. Novel quantum-induced effects have been predicted in random graphs---where edges represent entangled links---and quantum computer algorithms have been proposed to offer enhancement for several network problems. Here we review the results at the cutting edge, pinpointing the similarities and the differences found at the intersection of these two fields. Comment: 12 pages, 4 figures, REVTeX 4-1, accepted version

    Fractals in the Nervous System: Conceptual Implications for Theoretical Neuroscience

    This essay is presented with two principal objectives in mind: first, to document the prevalence of fractals at all levels of the nervous system, giving credence to the notion of their functional relevance; and second, to draw attention to the still unresolved issues of the detailed relationships among power law scaling, self-similarity, and self-organized criticality. As regards criticality, I will document that it has become a pivotal reference point in neurodynamics. Furthermore, I will emphasize the not yet fully appreciated significance of allometric control processes. For dynamic fractals, I will assemble reasons for attributing to them the capacity to adapt task execution to contextual changes across a range of scales. The final section consists of general reflections on the implications of the reviewed data, and identifies what appear to be issues of fundamental importance for future research in the rapidly evolving topic of this review.