
    Towards a Formulation of Quantum Theory as a Causally Neutral Theory of Bayesian Inference

    Quantum theory can be viewed as a generalization of classical probability theory, but the analogy as it has been developed so far is not complete. Whereas the manner in which inferences are made in classical probability theory is independent of the causal relation that holds between the conditioned variable and the conditioning variable, in the conventional quantum formalism, there is a significant difference between how one treats experiments involving two systems at a single time and those involving a single system at two times. In this article, we develop the formalism of quantum conditional states, which provides a unified description of these two sorts of experiment. In addition, concepts that are distinct in the conventional formalism become unified: channels, sets of states, and positive operator valued measures are all seen to be instances of conditional states; the action of a channel on a state, ensemble averaging, the Born rule, the composition of channels, and nonselective state-update rules are all seen to be instances of belief propagation. Using a quantum generalization of Bayes' theorem and the associated notion of Bayesian conditioning, we also show that the remote steering of quantum states can be described within our formalism as a mere updating of beliefs about one system given new information about another, and that retrodictive inferences can be expressed using the same belief propagation rule as is used for predictive inferences. Finally, we show that previous arguments for interpreting the projection postulate as a quantum generalization of Bayesian conditioning are based on a misleading analogy, and that it is best understood as a combination of belief propagation (corresponding to the nonselective state-update map) and conditioning on the measurement outcome.
    Comment: v1, 43 pages, revTeX4. v2, 42 pages, edited for clarity, added references and corrected minor errors, submitted to Phys. Rev. A. v3, 41 pages, improved figures, added two new figures, added extra explanation in response to referee comments, minor rewrites for readability. v4, 44 pages, added "towards" to title, rewritten abstract, rewritten introduction with new table.
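For orientation, the classical rule that this paper generalizes is Bayes' theorem for inverting a conditional probability. In the conditional-states formalism the analogous inversion acts on operators; a schematic rendering of the analogy (our reconstruction in plausible notation, not quoted from the paper) is:

```latex
% Classical Bayes' theorem: inverting a conditional probability.
P(a \mid b) = \frac{P(b \mid a)\,P(a)}{P(b)}

% Schematic quantum analogue for conditional states. The ordering of
% the noncommuting factors is fixed by a symmetric product so that
% the result remains a positive operator:
\rho_{A|B} = \rho_{B|A} \star \bigl(\rho_A \otimes \rho_B^{-1}\bigr),
\qquad M \star N \equiv N^{1/2}\, M\, N^{1/2}
```

In the classical limit, where all operators are diagonal in a common basis, the symmetric product reduces to ordinary multiplication and the second line collapses to the first.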

    Conic Optimization Theory: Convexification Techniques and Numerical Algorithms

    Optimization is at the core of control theory and appears in several areas of this field, such as optimal control, distributed control, system identification, robust control, state estimation, model predictive control, and dynamic programming. Recent advances in various topics of modern optimization have also been reshaping the area of machine learning. Motivated by the crucial role of optimization theory in the design, analysis, control, and operation of real-world systems, this tutorial paper offers a detailed overview of some major advances in this area, namely conic optimization and its emerging applications. First, we discuss the importance of conic optimization in different areas. Then, we explain seminal results on the design of hierarchies of convex relaxations for a wide range of nonconvex problems. Finally, we study different numerical algorithms for large-scale conic optimization problems.
    Comment: 18 pages
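The relaxation machinery the abstract refers to builds on a standard form: a linear objective, linear equality constraints, and membership in a convex cone. As a minimal sketch of how a nonconvex problem enters this form (standard textbook material, not taken from this paper):

```latex
% Conic program in standard form: K is a convex cone, e.g. the
% nonnegative orthant (LP), second-order cone (SOCP), or the cone
% of positive-semidefinite matrices (SDP).
\min_{x}\; c^{\top} x
\quad \text{s.t.} \quad A x = b, \qquad x \in \mathcal{K}

% Shor relaxation of a quadratic problem: the nonconvex coupling
% X = x x^{\top} is relaxed to the convex conic (PSD) constraint
\begin{pmatrix} 1 & x^{\top} \\ x & X \end{pmatrix} \succeq 0,
% which is equivalent to X \succeq x x^{\top} by the Schur complement.
```

Hierarchies of relaxations of the kind the paper surveys tighten this basic step by adding further valid conic constraints on higher-order liftings.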

    Self-Monitoring Reading of Implicit Contents and Moral of Self

    The research concerns the psychological processes of self-monitoring and aims to verify two hypotheses: that high self-monitors (HSMs) and low self-monitors (LSMs) differ in their theories of the Self, and that HSMs are more capable readers of implicit content than LSMs. The 18-item SMS was administered to 86 people, who also underwent an implicit reading test. The HSMs and LSMs were then studied in depth through further implicit reading trials and through dilemmas intended to explore their implicit theories of self. The HSMs read more implicit contents, in a livelier way and with less fatigue. The dilemmas reveal differences in the structure of the self and, above all, in the moral conviction of self.

    Variational Probabilistic Inference and the QMR-DT Network

    We describe a variational approximation method for efficient inference in large-scale probabilistic models. Variational methods are deterministic procedures that provide approximations to marginal and conditional probabilities of interest. They provide alternatives to approximate inference methods based on stochastic sampling or search. We describe a variational approach to the problem of diagnostic inference in the `Quick Medical Reference' (QMR) network. The QMR network is a large-scale probabilistic graphical model built on statistical and expert knowledge. Exact probabilistic inference is infeasible in this model for all but a small set of cases. We evaluate our variational inference algorithm on a large set of diagnostic test cases, comparing the algorithm to a state-of-the-art stochastic sampling method.
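The core idea behind deterministic variational approximation can be shown on a toy model. The sketch below is a generic mean-field coordinate-ascent scheme on a two-variable joint distribution, chosen for brevity; it is not the QMR-DT model or the specific bound used in the paper, and the distribution `p` is an arbitrary illustrative example.

```python
import math

# Mean-field variational sketch (illustrative toy, NOT the QMR-DT
# network): approximate a joint p(x1, x2) over binary variables by a
# factorized q(x1) * q(x2), using the coordinate-ascent updates
#   q_i(x_i) proportional to exp( E_{q_other}[ log p ] ).

# Unnormalized bimodal joint over (x1, x2) in {0, 1}^2.
p = {(0, 0): 20.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 20.0}

def update(q_other, axis):
    """One mean-field update: q(x) proportional to exp(E_q_other[log p])."""
    logits = []
    for x in (0, 1):
        e = 0.0
        for y in (0, 1):
            joint = p[(x, y)] if axis == 0 else p[(y, x)]
            e += q_other[y] * math.log(joint)
        logits.append(e)
    m = max(logits)                       # subtract max for stability
    w = [math.exp(v - m) for v in logits]
    s = sum(w)
    return [wi / s for wi in w]

q1, q2 = [0.6, 0.4], [0.6, 0.4]          # biased initialization
for _ in range(50):                       # coordinate ascent to a fixed point
    q1 = update(q2, axis=0)
    q2 = update(q1, axis=1)

# The true marginals of this symmetric joint are uniform, but the
# factorized q cannot represent both modes: it locks onto the mode
# favored by the initialization, here (0, 0).
print(q1, q2)
```

The mode-locking in the last step illustrates the characteristic trade-off of deterministic variational methods versus stochastic sampling: they converge quickly and deterministically, but to a biased approximation of the posterior.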