963 research outputs found

    Informationally Complete Measurements and Optimal Representations of Quantum Theory

    Get PDF
    Minimal informationally complete quantum measurements (MICs) furnish probabilistic representations of quantum theory. These representations cleanly present the Born rule as an additional constraint in probabilistic decision theory, a perspective advanced by QBism. Because of this, their structure illuminates important ways in which quantum theory differs from classical physics. MICs have, however, so far received relatively little attention. In this dissertation, we investigate some of their general properties and relations to other topics in quantum information. A special type of MIC called a symmetric informationally complete measurement makes repeated appearances as the optimal or extremal solution in distinct settings, signifying that these measurements play a significant foundational role. Once the general structure of MICs is more fully explicated, we speculate that the representation will have unique advantages analogous to those of the phase space and path integral formulations. On the conceptual side, the reasons for adopting QBism continue to grow. Most recently, extensions of the Wigner's friend paradox have threatened the consistency of many interpretations. QBism's resolution is uniquely simple and powerful, further strengthening the case for this interpretation.
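    For concreteness, here is the standard example from the QBism literature (not a result specific to this dissertation) of how the Born rule appears as an added constraint on probability assignments when a SIC is used as the reference measurement. With a SIC $\{\Pi_i\}_{i=1}^{d^2}$ in dimension $d$, state probabilities $p(i) = \tfrac{1}{d}\operatorname{Tr}(\rho\,\Pi_i)$, and conditional probabilities $r(j|i) = \operatorname{Tr}(\Pi_i D_j)$ for a subsequent measurement $\{D_j\}$, the Born rule takes the form
    \[
    q(j) \;=\; \sum_{i=1}^{d^2} \left[(d+1)\,p(i) - \frac{1}{d}\right] r(j|i),
    \]
    which differs from the classical law of total probability $q(j) = \sum_i p(i)\,r(j|i)$ only by an affine rescaling of the reference probabilities.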

    A Defense of Pure Connectionism

    Full text link
    Connectionism is an approach to neural-networks-based cognitive modeling that encompasses the recent deep learning movement in artificial intelligence. It came of age in the 1980s, with its roots in cybernetics and earlier attempts to model the brain as a system of simple parallel processors. Connectionist models center on statistical inference within neural networks with empirically learnable parameters, which can be represented as graphical models. More recent approaches focus on learning and inference within hierarchical generative models. Contra influential and ongoing critiques, I argue in this dissertation that the connectionist approach to cognitive science possesses in principle (and, as is becoming increasingly clear, in practice) the resources to model even the most rich and distinctly human cognitive capacities, such as abstract, conceptual thought and natural language comprehension and production. Consonant with much previous philosophical work on connectionism, I argue that a core principle, namely that proximal representations in a vector space have similar semantic values, is the key to a successful connectionist account of the systematicity and productivity of thought, language, and other core cognitive phenomena. My work here differs from preceding work in philosophy in several respects: (1) I compare a wide variety of connectionist responses to the systematicity challenge and isolate two main strands that are both historically important and reflected in ongoing work today: (a) vector symbolic architectures and (b) (compositional) vector space semantic models; (2) I consider very recent applications of these approaches, including their deployment on large-scale machine learning tasks such as machine translation; (3) I argue, again on the basis mostly of recent developments, for a continuity in representation and processing across natural language, image processing and other domains; (4) I explicitly link broad, abstract features of connectionist representation to recent proposals in cognitive science similar in spirit, such as hierarchical Bayesian and free energy minimization approaches, and offer a single rebuttal of criticisms of these related paradigms; (5) I critique recent alternative proposals that argue for a hybrid Classical (i.e. serial symbolic)/statistical model of mind; (6) I argue that defending the most plausible form of a connectionist cognitive architecture requires rethinking certain distinctions that have figured prominently in the history of the philosophy of mind and language, such as that between word- and phrase-level semantic content, and between inference and association.
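    To make the vector symbolic architectures mentioned in (1a) concrete, the sketch below implements binding and unbinding with circular convolution, in the spirit of Plate's holographic reduced representations. The dimensionality, role/filler names, and encoded example are illustrative choices, not taken from the dissertation.

```python
# Illustrative sketch of binding/unbinding in a vector symbolic architecture
# (circular convolution, as in holographic reduced representations).
# All vectors and role/filler names are invented for the example.
import numpy as np

rng = np.random.default_rng(0)
D = 1024  # dimensionality of the representation space

def rand_vec():
    # Random unit vectors serve as atomic symbols.
    v = rng.normal(size=D)
    return v / np.linalg.norm(v)

def bind(a, b):
    # Circular convolution binds a role vector to a filler vector.
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(c, a):
    # Circular correlation with the role approximately recovers the filler.
    return np.real(np.fft.ifft(np.fft.fft(c) * np.conj(np.fft.fft(a))))

# Encode the structured proposition loves(john, mary) as a single vector.
agent, patient, john, mary = rand_vec(), rand_vec(), rand_vec(), rand_vec()
sentence = bind(agent, john) + bind(patient, mary)

# Query the structure: nearby vectors carry similar semantic values.
decoded = unbind(sentence, agent)
print(np.dot(decoded, john), np.dot(decoded, mary))  # high vs. near-zero similarity
```

    The superposition-plus-binding scheme is one way a fixed-dimensional vector space can encode compositional role/filler structure, which is the property at issue in the systematicity debate.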

    Optimizing quantum circuit layouts

    Get PDF
    One of the challenges in quantum computing is the problem of optimizing quantum circuit compilation. The compilation process involves two main stages: synthesizing the circuit to be executed in terms of the quantum gates supported by the processor, and adapting the circuit to the connectivity limitations imposed by the processor. In this work, I have addressed the second of these problems, known as Quantum Circuit Layout (QCL). To tackle this problem, I have attempted to use Reinforcement Learning (RL) techniques, which require first modeling the problem as a Markov Decision Process (MDP). Specifically, I describe two finite MDPs whose solution provides a solution to part of the QCL problem. The main problem is to design a method that effectively solves these MDPs, even if only approximately. Two approaches are discussed in the thesis. The first uses a variant of the algorithm used in AlphaZero, which was designed to train a machine to play Chess, Shogi, and Go. The second uses a more standard approach known as Deep Q-Learning (DQL).
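    As a deliberately tiny illustration of the kind of MDP formulation described above, the sketch below models qubit routing on a 3-qubit line as a finite MDP in which a state is a (logical-to-physical mapping, next gate) pair and an action either inserts a SWAP or executes the next gate, and solves it with tabular Q-learning. It is a simplified stand-in for the DQL and AlphaZero-style methods discussed in the thesis; the coupling map, gate list, rewards, and hyperparameters are all invented for the example.

```python
# Toy qubit-routing MDP solved with tabular Q-learning (illustrative only).
import random
from itertools import permutations

COUPLING = [(0, 1), (1, 2)]          # physical coupling map: a 3-qubit line
GATES = [(0, 2), (0, 1), (1, 2)]     # two-qubit gates on logical qubits, in order

def adjacent(p, q):
    return (p, q) in COUPLING or (q, p) in COUPLING

def step(state, action):
    """state = (tuple mapping logical -> physical, index of next gate)."""
    mapping, g = state
    if action < len(COUPLING):                       # action: insert a SWAP on an edge
        a, b = COUPLING[action]
        m = list(mapping)
        i, j = m.index(a), m.index(b)
        m[i], m[j] = m[j], m[i]
        return (tuple(m), g), -1.0, False            # each SWAP costs -1
    u, v = GATES[g]                                  # action: execute the next gate
    if adjacent(mapping[u], mapping[v]):
        return (mapping, g + 1), 0.0, g + 1 == len(GATES)
    return (mapping, g), -5.0, False                 # gate not executable: penalty

ACTIONS = list(range(len(COUPLING) + 1))
Q = {}                                               # tabular action-value function

def qval(s, a):
    return Q.get((s, a), 0.0)

def train(episodes=2000, alpha=0.5, gamma=0.95, eps=0.2):
    for _ in range(episodes):
        state = (random.choice(list(permutations(range(3)))), 0)
        done = False
        while not done:
            a = (random.choice(ACTIONS) if random.random() < eps
                 else max(ACTIONS, key=lambda x: qval(state, x)))
            nxt, reward, done = step(state, a)
            target = reward + (0.0 if done else gamma * max(qval(nxt, x) for x in ACTIONS))
            Q[(state, a)] = qval(state, a) + alpha * (target - qval(state, a))
            state = nxt

train()
# After training, a greedy policy over Q yields a low-SWAP layout for this toy instance.
```

    A deep variant would replace the dictionary Q with a neural network evaluated on a featurized state, which is what makes the approach scale beyond toy instances.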

    Trinity College Bulletin, 1980-1981 (Report of the President)

    Get PDF
    https://digitalrepository.trincoll.edu/bulletin/1662/thumbnail.jp

    Rewindable Quantum Computation and Its Equivalence to Cloning and Adaptive Postselection

    Get PDF
    We define rewinding operators that invert quantum measurements. Then, we define the complexity classes $\mathsf{RwBQP}$, $\mathsf{CBQP}$, and $\mathsf{AdPostBQP}$ as the sets of decision problems solvable by polynomial-size quantum circuits with a polynomial number of rewinding operators, cloning operators, and adaptive postselections, respectively. Our main result is that $\mathsf{BPP}^{\mathsf{PP}} \subseteq \mathsf{RwBQP} = \mathsf{CBQP} = \mathsf{AdPostBQP} \subseteq \mathsf{PSPACE}$. As a byproduct of this result, we show that any problem in $\mathsf{PostBQP}$ can be solved with only postselections of outputs whose probabilities are polynomially close to one. Under the strongly believed assumption that $\mathsf{BQP} \nsupseteq \mathsf{SZK}$, or that the shortest independent vectors problem cannot be efficiently solved with quantum computers, we also show that a single rewinding operator is sufficient to achieve tasks that are intractable for quantum computation. In addition, we consider rewindable Clifford and instantaneous quantum polynomial time circuits.
    Comment: 29 pages, 3 figures, v2: Added Result 3 and improved Result