
    A single-shot measurement of the energy of product states in a translation invariant spin chain can replace any quantum computation

    In measurement-based quantum computation, quantum algorithms are implemented via sequences of measurements. We describe a translationally invariant finite-range interaction on a one-dimensional qudit chain and prove that a single-shot measurement of the energy of an appropriate computational basis state with respect to this Hamiltonian provides the output of any quantum circuit. The required measurement accuracy scales inverse polynomially with the size of the simulated quantum circuit. This shows that the implementation of energy measurements on generic qudit chains is as hard as the realization of quantum computation. Here a "measurement" is any procedure that samples from the spectral measure induced by the observable and the state under consideration. As opposed to measurement-based quantum computation, the post-measurement state is irrelevant. Comment: 19 pages, transition rules for the CA corrected
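    To make the notion of "measurement" used here concrete: it is any procedure that returns an eigenvalue of the observable with probability given by the squared overlap of the state with the corresponding eigenvector. A minimal NumPy sketch (illustrative, not from the paper; `sample_energy` and the toy Hamiltonian are hypothetical):

```python
import numpy as np

def sample_energy(H, psi, rng=None):
    """Sample one energy from the spectral measure of H in state psi.

    Outcome E_k occurs with probability |<E_k|psi>|^2, which is exactly
    what a single-shot energy measurement returns; the post-measurement
    state is discarded, matching the abstract's setting.
    """
    rng = rng or np.random.default_rng()
    energies, eigvecs = np.linalg.eigh(H)     # H = V diag(E) V^dagger
    amplitudes = eigvecs.conj().T @ psi       # overlaps <E_k|psi>
    probs = np.abs(amplitudes) ** 2
    probs /= probs.sum()                      # guard against rounding error
    return rng.choice(energies, p=probs)

# Toy example: a single spin with a Pauli-X Hamiltonian, measured in |0>.
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])
psi = np.array([1.0, 0.0])                    # computational basis state
print(sample_energy(H, psi))                  # +1 or -1, each with prob. 1/2
```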

    Complexity, parallel computation and statistical physics

    The intuition that a long history is required for the emergence of complexity in natural systems is formalized using the notion of depth. The depth of a system is defined in terms of the number of parallel computational steps needed to simulate it. Depth provides an objective, irreducible measure of history applicable to systems of the kind studied in statistical physics. It is argued that physical complexity cannot occur in the absence of substantial depth and that depth is a useful proxy for physical complexity. The ideas are illustrated for a variety of systems in statistical physics. Comment: 21 pages, 7 figures
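    A standard example of how depth differs from total work is the prefix sum: n - 1 sequential additions collapse to ceil(log2 n) parallel rounds. A minimal Python sketch (illustrative; the Hillis-Steele scan is a textbook construction, not taken from the paper):

```python
def parallel_prefix_sum(xs):
    """Hillis-Steele parallel prefix sum.

    Each outer iteration is one parallel step (all updates within it are
    independent), so the depth is ceil(log2(n)) rather than the n - 1
    steps a sequential scan would need.
    """
    xs = list(xs)
    n = len(xs)
    depth = 0
    shift = 1
    while shift < n:
        # One parallel step: every position reads the old array, then all write.
        xs = [xs[i] + (xs[i - shift] if i >= shift else 0) for i in range(n)]
        shift *= 2
        depth += 1
    return xs, depth

sums, depth = parallel_prefix_sum(range(1, 9))
print(sums)    # [1, 3, 6, 10, 15, 21, 28, 36]
print(depth)   # 3 parallel steps for n = 8, versus 7 sequential additions
```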

    A Complexity Measure Based on Cognitive Weights

    Cognitive Informatics plays an important role in understanding the fundamental characteristics of software. This paper proposes a measure of one such characteristic, complexity, in terms of the cognitive weights of basic control structures. A cognitive weight is the degree of difficulty, or the relative time and effort, required to comprehend a given piece of software, which satisfies the definition of complexity. An attempt is also made to demonstrate the robustness of the proposed complexity measure by comparing it with other measures based on cognitive informatics.
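    A rough sketch of how a cognitive-weight measure can be computed over source code, using Python's ast module (hypothetical; the weight table below is illustrative, and the paper's exact weights and aggregation rule may differ):

```python
import ast

# Illustrative weights for basic control structures; a common convention
# in the cognitive-informatics literature assigns 1 to a sequence, 2 to
# a branch, and 3 to an iteration, but exact tables vary.
WEIGHTS = {ast.If: 2, ast.For: 3, ast.While: 3, ast.Call: 2}
BCS = tuple(WEIGHTS)

def cognitive_weight(source: str) -> int:
    """Sum cognitive weights over a piece of Python source.

    Plain statements count 1 (a sequence); branches, loops, and calls
    carry the heavier weights above. Nesting effects are ignored here,
    although refined measures multiply weights of nested structures.
    """
    total = 0
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, BCS):
            total += WEIGHTS[type(node)]
        elif isinstance(node, ast.stmt):
            total += 1
    return total

snippet = """
for i in range(10):
    if i % 2 == 0:
        print(i)
"""
# 3 (for) + 2 (range call) + 2 (if) + 1 (print statement) + 2 (print call) = 10
print(cognitive_weight(snippet))
```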

    Primordial Evolution in the Finitary Process Soup

    A general and basic model of primordial evolution, a soup of reacting finitary and discrete processes, is employed to identify and analyze fundamental mechanisms that generate and maintain complex structures in prebiotic systems. The processes (ε-machines, as defined in computational mechanics) and their interaction networks both provide well-defined notions of structure. This enables us to quantitatively demonstrate hierarchical self-organization in the soup in terms of complexity. We find that replicating processes evolve the strategy of successively building higher levels of organization by autocatalysis. Moreover, this is facilitated by local components that have low structural complexity but high generality. In effect, the finitary process soup spontaneously evolves a selection pressure that favors such components. In light of the finitary process soup's generality, these results suggest a fundamental law of hierarchical systems: global complexity requires local simplicity. Comment: 7 pages, 10 figures; http://cse.ucdavis.edu/~cmg/compmech/pubs/pefps.ht
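    For a sense of the structure measure involved: the statistical complexity of an ε-machine is the Shannon entropy of the stationary distribution over its causal states. A minimal NumPy sketch (illustrative, not from the paper; the golden-mean process is a standard textbook example):

```python
import numpy as np

def statistical_complexity(T):
    """Statistical complexity C_mu of an epsilon-machine.

    T[i, j] is the probability of moving from causal state i to state j
    (emitted symbols marginalized out). C_mu is the Shannon entropy of
    the stationary distribution over causal states, the standard
    structure measure in computational mechanics.
    """
    eigvals, eigvecs = np.linalg.eig(T.T)
    pi = eigvecs[:, np.argmin(np.abs(eigvals - 1.0))]   # eigenvalue-1 eigenvector
    pi = np.abs(np.real(pi))
    pi /= pi.sum()
    pi = pi[pi > 0]                                     # drop unreachable states
    return float(-(pi * np.log2(pi)).sum())

# Golden-mean process: state A emits 0 or 1 with probability 1/2,
# moving to B on a 0; state B must emit a 1 and return to A.
T = np.array([[0.5, 0.5],
              [1.0, 0.0]])
print(statistical_complexity(T))   # ~0.918 bits; stationary pi = (2/3, 1/3)
```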