30 research outputs found

    Non-separability of Physical Systems as a Foundation of Consciousness

    A hypothesis is presented that non-separability of degrees of freedom is the fundamental property underlying consciousness in physical systems. The amount of consciousness in a system is determined by the extent of non-separability and the number of degrees of freedom involved. Non-interacting and feedforward systems have zero consciousness, whereas most systems of interacting particles appear to have low non-separability and consciousness. By contrast, brain circuits exhibit high complexity and weak but tightly coordinated interactions, which appear to support high non-separability and therefore a high amount of consciousness. The hypothesis applies to both classical and quantum cases, and we highlight the formalism employing the Wigner function (which in the classical limit becomes the Liouville density function) as a potentially fruitful framework for characterizing non-separability and, thus, the amount of consciousness in a system. The hypothesis appears to be consistent with both the Integrated Information Theory and the Orchestrated Objective Reduction Theory and may help reconcile the two. It offers a natural explanation for the physical properties underlying the amount of consciousness and points to methods of estimating the amount of non-separability as promising ways of characterizing the amount of consciousness.
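    For Gaussian states the phase-space (Wigner-function) picture the abstract invokes makes non-separability fully computable from the covariance matrix. The sketch below — our own illustration, not code from the paper — applies Simon's PPT criterion to a two-mode squeezed vacuum with squeezing parameter r (hbar = 1, vacuum variance 1/2): the state is non-separable exactly when the smallest symplectic eigenvalue of the partially transposed covariance matrix drops below 1/2.

    ```python
    import numpy as np

    def ptrans_symplectic_eigenvalue(r):
        """Smallest symplectic eigenvalue of the partially transposed
        covariance matrix of a two-mode squeezed vacuum with squeezing r
        (hbar = 1 convention: vacuum quadrature variance 1/2)."""
        c, s = np.cosh(2 * r) / 2, np.sinh(2 * r) / 2
        A = c * np.eye(2)             # mode-1 block
        B = c * np.eye(2)             # mode-2 block
        C = s * np.diag([1.0, -1.0])  # cross-correlation block
        V = np.block([[A, C], [C.T, B]])
        # Partial transposition flips the sign of det C in the invariant
        delta = np.linalg.det(A) + np.linalg.det(B) - 2 * np.linalg.det(C)
        nu2 = (delta - np.sqrt(delta**2 - 4 * np.linalg.det(V))) / 2
        return np.sqrt(nu2)

    # r = 0: product of two vacua, separable (eigenvalue sits at the
    # vacuum limit 1/2); r > 0: value below 1/2 signals non-separability.
    print(ptrans_symplectic_eigenvalue(0.0))  # 0.5
    print(ptrans_symplectic_eigenvalue(1.0))  # below 0.5: entangled
    ```

    For this family the eigenvalue is exp(-2r)/2 analytically, so any finite squeezing makes the two modes non-separable — a minimal quantitative handle of the kind the hypothesis would need.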

    The Case for Quantum State Realism

    I argue for a realist interpretation of the quantum state. I begin by reviewing and critically evaluating two arguments for an antirealist interpretation of the quantum state, the first derived from the so-called ‘measurement problem’, and the second from the concept of local causality. I argue that existing antirealist interpretations do not solve the measurement problem. Furthermore, I argue that it is possible to construct a local, realist interpretation of quantum mechanics, using methods borrowed from quantum field theory and based on John S. Bell’s concept of ‘local beables’. If the quantum state is interpreted subjectively, then the probabilities it associates with experimental outcomes are themselves subjective. I address the prospects for developing a subjective Bayesian interpretation of quantum mechanical probabilities based on the Quantum de Finetti Representation Theorem. Epistemic interpretations of the quantum state can be divided into those that are epistemic with respect to underlying ontic states, and those that are epistemic with respect to measurement outcomes. The Pusey, Barrett and Rudolph (PBR) theorem places serious constraints on the former family of interpretations. I identify an important explanatory gap in the latter sort of interpretation. In particular, if the quantum state is a subjective representation of beliefs about future experimental outcomes, then it is not clear why those experimenters who use quantum mechanics should be better able to negotiate the world than those who do not. I then turn to the task of articulating a positive argument for the thesis of quantum state realism. I begin by articulating a minimal set of conditions that any realist interpretation must meet. One assumption built into the PBR result is that systems prepared in a given quantum state have a well-defined set of physical properties, which may be completely or incompletely described by the quantum state. Antirealist interpretations that reject this assumption are therefore compatible with the PBR result. A compelling case for quantum state realism must therefore be made on more general grounds. I consider two concrete examples of phenomena described by quantum mechanics that strongly suggest that the quantum state is genuinely representational in character.

    On the Relative Nature of Quantum Individuals

    In this work we argue against the interpretation that underlies the “Standard” account of Quantum Mechanics (SQM) that was established during the 1930s by Niels Bohr and Paul Dirac. Ever since, following this orthodox narrative, physicists have dogmatically proclaimed, quite regardless of the deep contradictions and problems, that the theory of quanta describes a microscopic realm composed of elementary particles (such as electrons, protons and neutrons) which underlie our macroscopic world composed of tables, chairs and dogs. After critically addressing this atomist dogma, still present today in contemporary (quantum) physics and philosophy, we present a new understanding of quantum individuals, defined as the minimum set of relations within a specific degree of complexity capable of accounting for all relations within that same degree. In this case, quantum individuality is not conceived in absolute terms but, instead, as an objectively relative concept which, even though it depends on the choice of bases and factorizations, remains nonetheless part of the same invariant representation.

    Explanations from contemporary quantum theories: some ontological characteristics

    The starting position of this thesis is that scientific knowledge is incomplete without explanations, whereupon with the development of new theories our knowledge both broadens and deepens (as fundamental theories explain more and become more general). Historically, it was quantum theory (in the early 20th century), or initially quantum mechanics, that finally undermined the supposed runaway success of reductionist mechanistic explanatory philosophy (modulo Maxwellian updating), re-opening the door for scepticism about the explanatory aims of science. However, recent years have seen a revival of the belief in some version of quantum theory, either as part of a fundamental complete theory or as reinvented in terms of constraints on information gathering about the underlying unobservable ontology of the physical world.

    Entanglement and Information in Algebraic Quantum Theories

    The algebraic approach to physical theories provides a general framework encompassing both classical and quantum mechanics. Accordingly, by looking at the behaviour of the relevant algebras of observables one can investigate structural and conceptual differences between the theories. Interesting foundational questions can be formulated algebraically and their answers are then given in a mathematically compelling way. My dissertation focuses on some philosophical issues concerning entanglement and quantum information as they arise in the algebraic approach. These two concepts are connected in that one can exploit the non-local character of quantum theory to construct protocols of information theory which are not realized in the classical world. I first introduce the basic concepts of the algebraic formalism by reviewing von Neumann's work on the mathematical foundations of quantum theories. After presenting the reasons why von Neumann abandoned the standard Hilbert space formulation in favour of the algebraic approach, I show how his axiomatic program remained a "utopia" in mathematical physics. The Bayesian interpretation of quantum mechanics is grounded in information-theoretical considerations. I take on some specific problems concerning the extension of Bayesian statistical inference to infinite dimensional Hilbert space. I demonstrate that the failure of a stability condition, formulated as a rationality constraint for classical Bayesian conditional probabilities, does not undermine the Bayesian interpretation of quantum probabilities. I then provide a solution to the problem of Bayesian noncommutative statistical inference in general probability spaces. Furthermore, I propose a derivation of the a priori probability state in quantum mechanics from symmetry considerations. Finally, Algebraic Quantum Field Theory offers a rigorous axiomatization of quantum field theory, namely the synthesis of quantum mechanics and special relativity. In such a framework one can raise the question of whether or not quantum correlations are made stronger by adding relativistic constraints. I argue that entanglement is more robust in the relativistic context than in ordinary quantum theory. In particular, I show how to generalize the claim that entanglement across space-like separated regions of Minkowski spacetime would persist, no matter how one acts locally.

    Entanglement in Many-Body Systems

    The recent interest in aspects common to quantum information and condensed matter has prompted intense activity at the border between these disciplines, which were far apart until a few years ago. Numerous interesting questions have been addressed so far. Here we review an important part of this field, the properties of entanglement in many-body systems. We discuss the zero- and finite-temperature properties of entanglement in interacting spin, fermionic and bosonic model systems. Both bipartite and multipartite entanglement will be considered. At equilibrium we emphasize how entanglement is connected to the phase diagram of the underlying model. The behavior of entanglement can be related, via certain witnesses, to thermodynamic quantities, thus offering interesting possibilities for an experimental test. Out of equilibrium we discuss how to generate and manipulate entangled states by means of many-body Hamiltonians.
    Comment: 61 pages, 29 figures
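    The standard bipartite measure for a pair of spins in such systems is the Wootters concurrence. As a small self-contained illustration (ours, not taken from the review), the sketch below computes the concurrence of a two-qubit density matrix and evaluates it on the singlet — the ground state of the two-site antiferromagnetic Heisenberg model — and on the infinite-temperature (maximally mixed) state.

    ```python
    import numpy as np

    def concurrence(rho):
        """Wootters concurrence of a two-qubit density matrix rho."""
        sy = np.array([[0, -1j], [1j, 0]])
        Y = np.kron(sy, sy)
        rho_tilde = Y @ rho.conj() @ Y  # spin-flipped state
        ev = np.linalg.eigvals(rho @ rho_tilde)
        lam = np.sort(np.sqrt(np.abs(ev.real)))[::-1]
        return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

    # Singlet (|01> - |10>)/sqrt(2): maximally entangled, concurrence 1.
    singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
    print(concurrence(np.outer(singlet, singlet)))  # ~1.0

    # Maximally mixed (infinite-temperature) state: concurrence 0.
    print(concurrence(np.eye(4) / 4))  # 0.0
    ```

    For thermal states of a spin chain the same function, applied to the reduced two-site density matrix, yields the pairwise entanglement whose temperature dependence the review relates to the phase diagram.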

    Non-Gaussianity in CMB analysis: bispectrum estimation and foreground subtraction

    The focus of this work is the development of statistical and numerical methods for the study of non-Gaussian and/or anisotropic features in cosmological surveys of the microwave sky. We focus on two very different types of non-Gaussian (NG) signals. The first is primordial non-Gaussianity (PNG), generated in the very early Universe during the inflationary expansion stage. In this case the aim of our study is to exploit the NG component in order to extract useful cosmological information. The second is non-Gaussianity generated by astrophysical foreground contamination. In this case, the goal is instead to use non-Gaussianity as a tool to help remove these spurious, non-cosmological components (of course foregrounds themselves contain relevant astrophysical information, but the focus of this thesis is on cosmology, so foregrounds are regarded here only as contaminants). Considerable effort has so far gone into the search for deviations from Gaussianity in the CMB anisotropies, which are expected to provide invaluable information about the inflationary epoch. Inflation is in fact expected to produce an isotropic and nearly Gaussian fluctuation field. However, a large number of models also predict very small, but highly model-dependent, NG signatures. This is the main reason behind the large interest in primordial NG studies. Of course, the pursuit of primordial non-Gaussianity must rely on beyond-power-spectrum statistics. It turns out that the most important higher-order correlator produced by interactions during inflation is the three-point function or, more precisely, its Fourier-space counterpart, the bispectrum. To overcome the issue of computing the full bispectrum of the observed field, which would require a prohibitive amount of computational time, the search for PNG features is carried out by fitting theoretically motivated bispectrum templates to the data.
    Among those, one finds bispectrum templates with a scale-dependent (SD) bispectrum amplitude. Such templates have so far received little attention in the literature, especially as far as NG statistical estimation and data analysis are concerned. This is why a significant part of this thesis is devoted to the development and application of efficient statistical pipelines for CMB scale-dependent bispectrum estimation. We present the results of the estimation of several primordial running bispectra obtained from the WMAP 9-year and Planck data sets. The second part of this thesis deals instead, as mentioned in the beginning, with the component-separation problem, i.e. the identification of the different sources that contribute to the microwave-sky brightness. Foreground emission produces several, potentially large, non-Gaussian signatures that can in principle be used to identify and remove the spurious components from microwave sky maps. Our focus is on the development of a foreground-cleaning technique relying on the hypothesis that, if the data are represented in a proper basis, the foreground signal is sparse. Sparseness implies that the majority of the signal is concentrated in a few basis elements, which can be used to fit the corresponding component with a thresholding algorithm. We verify that the spherical needlet frame has the right properties to disentangle the coherent foreground emission from the isotropic stochastic CMB signal. We make clear in what follows how sparseness in needlet space is in several ways linked to the coherence, anisotropy and non-Gaussianity of the foreground components. The main advantages of our needlet thresholding technique are that it does not require multi-frequency information and that it can be used in combination with other methods. It can therefore be a valuable tool in experiments with limited frequency coverage, such as current ground-based CMB surveys.
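    The thresholding idea can be shown in a toy one-dimensional setting (our own sketch: the pixel basis stands in for the needlet frame, and all numbers are illustrative, not from the thesis). A Gaussian "CMB" is contaminated by a foreground that is sparse in the chosen basis; coefficients above a few sigma of a robust noise estimate are attributed to the foreground and subtracted.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 4096

    # Toy sky: Gaussian "CMB" plus a foreground that is sparse in the
    # pixel basis (a handful of bright, point-source-like spikes).
    cmb = rng.normal(0.0, 1.0, n)
    fg = np.zeros(n)
    spikes = rng.choice(n, size=20, replace=False)
    fg[spikes] = rng.uniform(8.0, 15.0, size=20)
    data = cmb + fg

    # Hard thresholding: every coefficient exceeding k sigma of a robust
    # (median-absolute-deviation) noise estimate is fit as foreground.
    sigma = 1.4826 * np.median(np.abs(data - np.median(data)))
    k = 5.0
    mask = np.abs(data) > k * sigma
    fg_hat = np.where(mask, data, 0.0)
    cleaned = data - fg_hat

    print(np.std(data), np.std(cleaned))  # cleaning shrinks the variance
    ```

    In the thesis the same logic operates on spherical needlet coefficients, where the coherent, anisotropic foreground is sparse while the stochastic, isotropic CMB is not — which is exactly why a single-frequency threshold can separate the two.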