
    Infinite Correlation in Measured Quantum Processes

    We show that quantum dynamical systems can exhibit infinite correlations in their behavior when repeatedly measured. We model quantum processes using quantum finite-state generators and take the stochastic language they generate as a representation of their behavior. We analyze two spin-1 quantum systems that differ only in how they are observed. The language generated has short-range correlations in one case and infinite correlations in the other.
    Comment: 2 pages, 2 figures
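
    To make the model concrete, here is a minimal sketch (not the authors' code) of a quantum finite-state generator: a fixed unitary evolves a small quantum state, a projective measurement follows each step, and the outcome labels are emitted as a symbol sequence. The specific unitary and the grouping of measurement outcomes into symbols below are illustrative assumptions, not the paper's two spin-1 systems.

```python
# Minimal sketch of a quantum finite-state generator: evolve, measure,
# collapse, emit. The unitary and outcome grouping are illustrative
# assumptions, not the paper's spin-1 constructions.
import numpy as np

def generate(unitary, projectors, state, steps, rng):
    """Emit one symbol per step: evolve, measure projectively, collapse."""
    symbols = []
    for _ in range(steps):
        state = unitary @ state
        probs = [np.vdot(P @ state, P @ state).real for P in projectors]
        s = rng.choice(len(projectors), p=np.asarray(probs) / sum(probs))
        state = projectors[s] @ state
        state /= np.linalg.norm(state)          # renormalize after collapse
        symbols.append(int(s))
    return symbols

rng = np.random.default_rng(0)
# Toy 3-level (spin-1-like) example: a random unitary plus a coarse
# measurement that merges two levels into one symbol. Changing this
# grouping is the kind of "how it is observed" difference the abstract
# refers to.
U = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))[0]
P0 = np.diag([1.0, 0.0, 0.0]).astype(complex)   # symbol 0: level |0>
P1 = np.diag([0.0, 1.0, 1.0]).astype(complex)   # symbol 1: levels |1>, |2>
seq = generate(U, [P0, P1], np.array([1, 0, 0], complex), 20, rng)
print("".join(map(str, seq)))
```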

    Beyond Descartes and Newton: Recovering life and humanity

    Attempts to ‘naturalize’ phenomenology challenge both traditional phenomenology and traditional approaches to cognitive science. They challenge Edmund Husserl’s rejection of naturalism and his attempt to establish phenomenology as a foundational transcendental discipline, and they challenge efforts to explain cognition through mainstream science. While naturalization may appear to be a retreat from the bold claims made for phenomenology, it is really its triumph. Naturalized phenomenology is spearheading a successful challenge to the heritage of Cartesian dualism. This converges with the reaction against Cartesian thought within science itself. Descartes divided the universe between res cogitans, thinking substances, and res extensa, the mechanical world. The latter won out with Newton, and in most of objective science since we have literally lost our mind, and hence our humanity. Despite Darwin, biologists remain children of Newton and dream of a grand theory that is epistemologically complete and would allow lawful entailment of the evolution of the biosphere. This dream is no longer tenable. We now have to recognize that science and scientists are within and part of the world we are striving to comprehend, as proponents of endophysics have argued, and that physics, biology and mathematics have to be reconceived accordingly. Interpreting quantum mechanics from this perspective is shown both to illuminate conscious experience and to reveal new paths for its further development. In biology we must now justify the use of the word “function”. As we shall see, we cannot prestate the ever-new biological functions that arise and constitute the very phase space of evolution. Hence we cannot mathematize the detailed becoming of the biosphere, nor write differential equations for functional variables we do not know ahead of time, nor integrate those equations; so no laws “entail” evolution, and the dream of a grand theory fails. In place of entailing laws, a post-entailing-law explanatory framework is proposed in which Actuals arise in evolution that constitute new boundary conditions: enabling constraints that create new, typically unprestatable, Adjacent Possible opportunities for further evolution, in which new Actuals arise, in a persistent becoming. Evolution flows into a typically unprestatable succession of Adjacent Possibles. Given the concept of function, the concept of the functional closure of an organism making a living in its world becomes central. Implications for patterns in evolution include historical reconstruction, statistical laws such as the distributions of extinction events or of species per genus, and the use of formal-cause, rather than efficient-cause, laws.

    Differentiation with stratification: a principle of theoretical physics in the tradition of the memory art

    The Art of Memory started with Aristotle's questions on memory. During its long evolution, it received important contributions from alchemists, was transformed by Ramon Llull, and apparently ended with Giordano Bruno, its best-known representative. Yet this tradition did not disappear: it lives on in the formulations of our modern scientific theories. From its initial form as a method of keeping information via associations, it became a principle of classification and structuring of knowledge. This principle, which we here name differentiation with stratification, is a structural design behind classical mechanics. By integrating two different traditions of science in one structure, this physical theory became the modern paradigm of science. In this paper, we show that this principle can also be formulated as a set of questions. This is done via an analysis of theories based on the epistemology of observational realism, which combines Rudolf Carnap's concept of a theory as a system of observational and theoretical languages with a criterion, grounded in analytical psychology, for separating observational languages. The 'nuclear' role of the observational laws, and the differentiations from this nucleus that reproduce the general cases of phenomena, reveal the memory art's heritage in the theories. We further argue that this design is also present in special relativity and in quantum mechanics.
    Comment: 6 pages, no figures; "Quantum theory from Problems to Advances", June 9-12, 2014, Linnaeus University, Vaxjo, Sweden

    Computation in Finitary Stochastic and Quantum Processes

    We introduce stochastic and quantum finite-state transducers as computation-theoretic models of classical stochastic and quantum finitary processes. Formal process languages, representing the distribution over a process's behaviors, are recognized and generated by suitable specializations. We characterize and compare deterministic and nondeterministic versions, summarizing their relative computational power in a hierarchy of finitary process languages. Quantum finite-state transducers and generators are a first step toward a computation-theoretic analysis of individual, repeatedly measured quantum dynamical systems. They are explored via several physical systems, including an iterated beam splitter, an atom in a magnetic field, and atoms in an ion trap--a special case of which implements the Deutsch quantum algorithm. We show that these systems' behaviors, and so their information processing capacity, depend sensitively on the measurement protocol.
    Comment: 25 pages, 16 figures, 1 table; http://cse.ucdavis.edu/~cmg; numerous corrections and updates
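
    The iterated beam splitter is a simple place to see the protocol sensitivity claimed above. A rough sketch, assuming a Hadamard-type beam-splitter unitary: measuring the photon's path after every pass yields a fair-coin symbol sequence, while measuring only after every second pass yields a constant sequence, since two passes compose to the identity.

```python
# Rough sketch of an iterated beam splitter under two measurement
# protocols, assuming a Hadamard-type beam-splitter unitary.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # beam-splitter unitary

def run(measure_every, steps, rng):
    state, out = np.array([1.0, 0.0]), []
    for t in range(1, steps + 1):
        state = H @ state
        if t % measure_every == 0:             # the measurement protocol
            p0 = abs(state[0]) ** 2
            s = 0 if rng.random() < p0 else 1
            state = np.eye(2)[s]               # collapse onto the outcome
            out.append(s)
    return out

rng = np.random.default_rng(1)
print(run(1, 16, rng))   # measure every pass: ~ fair coin flips
print(run(2, 16, rng))   # measure every 2nd pass: all zeros, since H @ H = I
```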

    Using the quantum probability ranking principle to rank interdependent documents

    A known limitation of the Probability Ranking Principle (PRP) is that it does not cater for dependence between documents. Recently, the Quantum Probability Ranking Principle (QPRP) has been proposed, which implicitly captures dependencies between documents through “quantum interference”. This paper explores whether this new ranking principle leads to improved performance for subtopic retrieval, where novelty and diversity are required. In a thorough empirical investigation, models based on the PRP, as well as other recently proposed ranking strategies for subtopic retrieval (i.e. Maximal Marginal Relevance (MMR) and Portfolio Theory (PT)), are compared against the QPRP. On the given task, the QPRP is shown to outperform these other ranking strategies. Moreover, unlike MMR and PT, the QPRP requires no parameter estimation or tuning, making it both simple and effective. This research demonstrates that the application of quantum theory to problems within information retrieval can lead to significant improvements.
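
    For intuition, here is a minimal sketch (not the paper's implementation) of greedy QPRP-style ranking: at each rank position, a candidate's score is its relevance probability plus an interference term with every already-ranked document, approximated here as -2*sqrt(P(d)*P(d'))*sim(d, d'), with cosine similarity of hypothetical document vectors standing in for the quantum phase term cos(theta).

```python
# Minimal sketch of greedy QPRP-style ranking. The interference between
# a candidate d and each ranked document d' is approximated as
#   -2 * sqrt(P(d) * P(d')) * sim(d, d'),
# with cosine similarity of (hypothetical) document vectors standing in
# for the quantum phase term cos(theta).
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def qprp_rank(relevance, vectors):
    """relevance: P(d) per document; vectors: rows are document vectors."""
    remaining, ranked = list(range(len(relevance))), []
    while remaining:
        def score(d):
            interference = sum(
                -2 * np.sqrt(relevance[d] * relevance[r]) * cosine(vectors[d], vectors[r])
                for r in ranked)
            return relevance[d] + interference
        best = max(remaining, key=score)
        ranked.append(best)
        remaining.remove(best)
    return ranked

# Toy example: doc 1 is nearly a duplicate of doc 0, so interference
# demotes it below the dissimilar doc 2 despite its higher relevance.
P = np.array([0.9, 0.85, 0.6])
V = np.array([[1.0, 0.0], [0.98, 0.1], [0.0, 1.0]])
print(qprp_rank(P, V))   # [0, 2, 1]: the redundant document drops
```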

    Quantum effects in linguistic endeavors

    When the information content of neural spike trains is classified in a linguistic endeavor, an uncertainty relation emerges between the bit size of a word and its duration. This uncertainty is associated with the task of synchronizing spike trains of different durations representing different words. The uncertainty involves peculiar quantum features, so that word comparison amounts to measurement-based quantum computation. Such quantum behavior explains the onset and decay of the memory window connecting successive pieces of a linguistic text. The behavior discussed here is applicable to other reported evidence of quantum effects in human linguistic processes, which has so far lacked a plausible framework: either no effort was made to assign an appropriate quantum constant, or speculation about microscopic processes dependent on Planck's constant yielded unrealistic decoherence times.
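
    Schematically, and only as a reading aid, the relation described can be written as below, where B is a word's bit size, T its duration, and C an effective constant fitted from data that plays the role of Planck's constant; the paper's precise form is not reproduced here.

```latex
% Schematic only: B = word bit size, T = word duration,
% C = effective, empirically fitted quantum constant.
\Delta B \, \Delta T \gtrsim C
```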