    A Bayesian Analogue of Gleason's Theorem

    We introduce a novel notion of probability within quantum history theories and give a Gleasonesque proof for these assignments. This involves introducing a tentative novel axiom of probability. We also discuss how these generalised probabilities are to be interpreted as partially ordered notions of preference, and we introduce a tentative generalised notion of Shannon entropy. A Bayesian approach to probability theory is adopted throughout, so the axioms we use are minimal criteria of rationality rather than ad hoc mathematical axioms. Comment: 14 pages; v2: minor stylistic changes; v3: changes made in line with the to-be-published version.
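    For reference, the standard Shannon entropy that the generalised notion would extend is (the standard definition, not taken from the abstract):

```latex
H(p) = -\sum_{i} p_i \log p_i ,
\qquad \text{where } \sum_{i} p_i = 1, \; p_i \ge 0 .
```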

    A Bayesian Account of Quantum Histories

    We investigate whether quantum history theories can be consistent with Bayesian reasoning and whether such an analysis helps clarify the interpretation of such theories. First, we summarise and extend recent work categorising two different approaches to formalising multi-time measurements in quantum theory. The standard approach describes an ordered series of measurements in terms of history propositions with non-additive "probabilities". The non-standard approach defines multi-time measurements as sets of exclusive and exhaustive history propositions, recovering the single-time exclusivity of results when discussing single-time history propositions. We analyse whether such history propositions can be consistent with Bayes' rule. We show that a certain class of histories is given a natural Bayesian interpretation, namely the linearly positive histories originally introduced by Goldstein and Page. Thus we argue that this gives a certain amount of interpretational clarity to the non-standard approach. We also attempt a justification of our analysis using Cox's axioms of probability theory. Comment: 24 pages, accepted for publication in Annals of Physics, minor corrections.
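    As background for the consistency analysis, Bayes' rule and the Goldstein–Page linear-positivity condition mentioned above can be stated as follows (standard forms; the notation $C_\alpha$ for the class operator of history $\alpha$ and $\rho$ for the initial state is an assumption, not taken from the abstract):

```latex
P(H \mid D) = \frac{P(D \mid H)\, P(H)}{P(D)} ,
\qquad
p(\alpha) = \operatorname{Re} \operatorname{Tr}\!\left( C_\alpha \rho \right) \ge 0 .
```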

    Bayesian Probabilities and the Histories Algebra

    Full text link
    We attempt a justification of a generalisation of the consistent-histories programme using a notion of probability that is valid for all complete sets of history propositions. This consists of introducing Cox's axioms of probability theory and showing that our candidate notion of probability obeys them. We also give a generalisation of Bayes' theorem and comment on how Bayesianism should be useful for the quantum gravity/cosmology programmes. Comment: 10 pages, accepted by Int. J. Theo. Phys. Feb 200

    Deep convolutional filtering for spatio-temporal denoising and artifact removal in arterial spin labelling MRI

    Arterial spin labelling (ASL) is a noninvasive imaging modality, used in the clinic and in research, that can give quantitative measurements of perfusion in the brain and other organs. However, because the signal-to-noise ratio is inherently low and the ASL acquisition is particularly prone to corruption by artifacts, image-processing methods such as denoising and artifact filtering are vital for generating accurate measurements of perfusion. In this work, we present a new simultaneous approach to denoising and artifact removal, using a novel deep convolutional joint-filter architecture to learn and exploit spatio-temporal properties of the ASL signal. Using data from 15 healthy subjects, we show that our approach achieves state-of-the-art performance in both denoising and artifact removal, improving peak signal-to-noise ratio by up to 50%. By allowing more accurate estimation of perfusion, even in challenging datasets, this technique offers an exciting new approach for ASL pipelines and might be used both to improve individual images and to increase the power of research studies using ASL.
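    The reported PSNR gain can be made concrete with the standard definition of peak signal-to-noise ratio; the helper below is an illustrative sketch, not code from the paper:

```python
import numpy as np

def psnr(reference, estimate, max_val=None):
    """Peak signal-to-noise ratio in dB between a reference image and an estimate.

    If max_val is not given, the peak value is taken from the reference image.
    """
    reference = np.asarray(reference, dtype=np.float64)
    estimate = np.asarray(estimate, dtype=np.float64)
    if max_val is None:
        max_val = reference.max()
    mse = np.mean((reference - estimate) ** 2)  # mean squared error
    return 10.0 * np.log10(max_val ** 2 / mse)
```

    Because PSNR is logarithmic in the mean squared error, halving the residual error amplitude raises PSNR by about 6 dB.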