175 research outputs found

    From Entropic Dynamics to Quantum Theory

    Full text link
    Non-relativistic quantum theory is derived from information codified into an appropriate statistical model. The basic assumption is that there is an irreducible uncertainty in the location of particles: positions constitute a configuration space and the corresponding probability distributions constitute a statistical manifold. The dynamics follows from a principle of inference, the method of Maximum Entropy. The concept of time is introduced as a convenient way to keep track of change. A welcome feature is that the entropic-dynamics notion of time incorporates a natural distinction between past and future. The statistical manifold is assumed to be a dynamical entity: its curved and evolving geometry determines the evolution of the particles which, in turn, react back and determine the evolution of the geometry. Imposing that the dynamics conserve energy leads to the Schroedinger equation and to a natural explanation of its linearity, its unitarity, and of the role of complex numbers. The phase of the wave function is explained as a feature of purely statistical origin. There is a quantum analogue to the gravitational equivalence principle. Comment: Extended and corrected version of a paper presented at MaxEnt 2009, the 29th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering (July 5-10, 2009, Oxford, Mississippi, USA). In version v3 I corrected a mistake and considerably simplified the argument. The overall conclusions remain unchanged.
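
    For orientation, the central object in the method of Maximum Entropy invoked above is the relative entropy of a trial distribution P with respect to a prior Q; one maximizes it subject to constraints that encode the available information. The form below is the standard one, supplied as background rather than quoted from the paper.

```latex
% Relative entropy of P with respect to the prior Q; the method of
% Maximum Entropy selects the P(x) that maximizes S[P|Q] subject to
% normalization and the constraints encoding the available information.
\[
  S[P \mid Q] \;=\; -\int dx\, P(x)\,\ln\frac{P(x)}{Q(x)}
\]
```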

    Jaynes' MaxEnt, Steady State Flow Systems and the Maximum Entropy Production Principle

    Full text link
    Jaynes' maximum entropy (MaxEnt) principle was recently used to give a conditional, local derivation of the "maximum entropy production" (MEP) principle, which states that a flow system with fixed flow(s) or gradient(s) will converge to a steady state of maximum production of thermodynamic entropy (R.K. Niven, Phys. Rev. E, in press). The analysis provides a steady-state analog of the MaxEnt formulation of equilibrium thermodynamics, applicable to many complex flow systems at steady state. The present study examines the classification of physical systems, with emphasis on the choice of constraints in MaxEnt. The discussion clarifies the distinction between equilibrium, fluid flow, source/sink, flow/reactive and other systems, leading into an appraisal of the application of MaxEnt to steady-state flow and reactive systems. Comment: 6 pages; paper for MaxEnt 2009.
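
    As background, Jaynes' MaxEnt prescribes maximizing the Shannon entropy subject to normalization and moment constraints; in the steady-state analog described above, the constrained moments are mean fluxes rather than mean energies. The generic statement below is standard and not taken from the paper.

```latex
% MaxEnt with a mean-flux constraint: the Lagrangian solution is the
% usual exponential (canonical) distribution over states.
\[
  \max_{\{p_i\}} \; H = -\sum_i p_i \ln p_i
  \quad \text{s.t.} \quad
  \sum_i p_i = 1, \;\; \sum_i p_i\,\mathcal{J}_i = \langle\mathcal{J}\rangle
  \;\;\Longrightarrow\;\;
  p_i = \frac{e^{-\lambda\,\mathcal{J}_i}}{\sum_j e^{-\lambda\,\mathcal{J}_j}}
\]
```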

    Entropic Priors and Bayesian Model Selection

    Full text link
    We demonstrate that the principle of maximum relative entropy (ME), used judiciously, can ease the specification of priors in model selection problems. The resulting effect is that models that make sharp predictions are disfavoured, weakening the usual Bayesian "Occam's Razor". This is illustrated with a simple example involving what Jaynes called a "sure thing" hypothesis. Jaynes' resolution of the situation involved introducing a large number of alternative "sure thing" hypotheses that were possible before we observed the data. However, in more complex situations, it may not be possible to explicitly enumerate large numbers of alternatives. The entropic priors formalism produces the desired result without modifying the hypothesis space or requiring explicit enumeration of alternatives; all that is required is a good model for the prior predictive distribution for the data. This idea is illustrated with a simple rigged-lottery example, and we outline how this idea may help to resolve a recent debate amongst cosmologists: is dark energy a cosmological constant, or has it evolved with time in some way? And how shall we decide, when the data are in? Comment: Presented at MaxEnt 2009, the 29th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering (July 5-10, 2009, Oxford, Mississippi, USA).
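
    The model-selection machinery at issue is the Bayes factor, a ratio of marginal likelihoods (evidences); the standard definitions are reproduced below for reference. A sharply predictive model concentrates its prior predictive distribution, so its evidence collapses when the data fall outside the predicted region; this is the "Occam's Razor" effect that entropic priors temper.

```latex
% Evidence (marginal likelihood) of model M, and the Bayes factor
% comparing models M_0 and M_1 on data D.
\[
  Z_M = p(D \mid M) = \int p(D \mid \theta, M)\,p(\theta \mid M)\,d\theta,
  \qquad
  B_{01} = \frac{Z_{M_0}}{Z_{M_1}}
\]
```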

    Discriminating between a Stochastic Gravitational Wave Background and Instrument Noise

    Full text link
    The detection of a stochastic background of gravitational waves could significantly impact our understanding of the physical processes that shaped the early Universe. The challenge lies in separating the cosmological signal from other stochastic processes such as instrument noise and astrophysical foregrounds. One approach is to build two or more detectors and cross-correlate their output, thereby enhancing the common gravitational wave signal relative to the uncorrelated instrument noise. When only one detector is available, as will likely be the case with the Laser Interferometer Space Antenna (LISA), alternative analysis techniques must be developed. Here we show that models of the noise and signal transfer functions can be used to tease apart the gravitational and instrument noise contributions. We discuss the role of gravitational-wave-insensitive "null channels" formed from particular combinations of the time delay interferometry, and derive a new combination that maintains this insensitivity for unequal-arm-length detectors. We show that, in the absence of astrophysical foregrounds, LISA could detect signals with energy densities as low as $\Omega_{\rm gw} = 6 \times 10^{-13}$ with just one month of data. We describe an end-to-end Bayesian analysis pipeline that is able to search for, characterize, and assign confidence levels for the detection of a stochastic gravitational wave background, and demonstrate the effectiveness of this approach using simulated data from the third round of Mock LISA Data Challenges. Comment: 10 pages, 10 figures.
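
    To make the cross-correlation idea concrete (the single-detector, null-channel analysis in the paper is considerably more involved), here is a minimal numerical sketch with invented noise levels: two channels share a weak stochastic signal, and their cross-correlation isolates its power from the much larger uncorrelated instrument noise.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
signal = rng.normal(0.0, 0.1, n)    # common stochastic background (invented level)
noise_a = rng.normal(0.0, 1.0, n)   # independent instrument noise, channel A
noise_b = rng.normal(0.0, 1.0, n)   # independent instrument noise, channel B
channel_a = signal + noise_a
channel_b = signal + noise_b

# The uncorrelated noise averages away in the cross term, leaving the
# common signal power; the auto term remains noise dominated.
cross = np.mean(channel_a * channel_b)  # ~ var(signal) = 0.01
auto = np.mean(channel_a ** 2)          # ~ 1.01
print(f"cross-correlation estimate of signal power: {cross:.4f}")
print(f"auto-correlation (signal + noise power):    {auto:.4f}")
```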

    Computational methods for Bayesian model choice

    Full text link
    In this note, we briefly survey some recent approaches to the approximation of the Bayes factor used in Bayesian hypothesis testing and in Bayesian model choice. In particular, we reassess importance sampling, harmonic mean sampling, and nested sampling from a unified perspective. Comment: 12 pages, 4 figures; submitted to the proceedings of MaxEnt 2009, July 05-10, 2009, to be published by the American Institute of Physics.
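
    As a concrete illustration of two of the surveyed estimators, the sketch below approximates the evidence of a toy conjugate-Gaussian model, for which the exact answer is available; this is an illustrative reconstruction, not code from the note.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
x = 1.5        # single observed datum (toy problem)
m = 200_000    # Monte Carlo sample size

# Model: theta ~ N(0, 1) prior, x | theta ~ N(theta, 1) likelihood,
# so the exact evidence is Z = N(x; 0, sqrt(2)).
z_true = norm.pdf(x, loc=0.0, scale=np.sqrt(2.0))

# Importance sampling with the prior as proposal: average the likelihood.
theta_prior = rng.normal(0.0, 1.0, m)
z_is = norm.pdf(x, loc=theta_prior, scale=1.0).mean()

# Harmonic mean estimator: 1/Z = E_posterior[1/L]; the posterior here
# is N(x/2, 1/2). The estimator is famously high-variance in general.
theta_post = rng.normal(x / 2.0, np.sqrt(0.5), m)
z_hm = 1.0 / (1.0 / norm.pdf(x, loc=theta_post, scale=1.0)).mean()

print(f"exact evidence:      {z_true:.5f}")
print(f"importance sampling: {z_is:.5f}")
print(f"harmonic mean:       {z_hm:.5f}")
```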

    TI-Stan: Model comparison using thermodynamic integration and HMC

    Get PDF
    © 2019 by the authors. We present a novel implementation of the adaptively annealed thermodynamic integration technique using Hamiltonian Monte Carlo (HMC). Thermodynamic integration with importance sampling and adaptive annealing is an especially useful method for estimating model evidence for problems that use physics-based mathematical models. Because it is based on importance sampling, this method requires an efficient way to refresh the ensemble of samples. Existing successful implementations use binary slice sampling on the Hilbert curve to accomplish this task. This implementation works well if the model has few parameters or if it can be broken into separate parts with identical parameter priors that can be refreshed separately. However, for models that are not separable and have many parameters, a different method for refreshing the samples is needed. HMC, in the form of the MC-Stan package, is effective for jointly refreshing the ensemble under a high-dimensional model. MC-Stan uses automatic differentiation to compute the gradients of the likelihood that HMC requires in about the same amount of time as it computes the likelihood function itself, easing the programming burden compared to implementations of HMC that require explicitly specified gradient functions. We present a description of the overall TI-Stan procedure and results for representative example problems.
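
    The identity underlying thermodynamic integration is $\log Z = \int_0^1 \mathrm{E}_\beta[\log L]\,d\beta$, where the expectation is taken under the tempered posterior $\propto L^\beta \pi$. The sketch below evaluates it on a toy Gaussian model whose per-temperature expectations are available in closed form; in TI-Stan those expectations would instead come from HMC samples refreshed at each temperature. The toy model and numbers are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

x = 1.5                              # observed datum (toy problem)
betas = np.linspace(0.0, 1.0, 101)   # inverse-temperature ladder

# Tempered posterior for prior N(0,1) and likelihood N(x; theta, 1)
# is Gaussian with these moments at inverse temperature beta.
mu_b = betas * x / (1.0 + betas)
s2_b = 1.0 / (1.0 + betas)

# E_beta[log L] in closed form for this Gaussian toy model.
e_logl = -0.5 * np.log(2 * np.pi) - 0.5 * ((x - mu_b) ** 2 + s2_b)

# Trapezoid rule over the ladder approximates the integral.
log_z_ti = np.sum(0.5 * (e_logl[1:] + e_logl[:-1]) * np.diff(betas))
log_z_true = norm.logpdf(x, loc=0.0, scale=np.sqrt(2.0))
print(f"TI estimate of log Z: {log_z_ti:.5f}")
print(f"exact log Z:          {log_z_true:.5f}")
```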

    Economic aspects of "liberalized depreciation"

    Get PDF

    Benefits of Using TeamChoice for Windows as a Multi-Criteria Decision-Making Group Decision Support System

    Get PDF
    Various experiments have been conducted over the past ten years using several different types of group decision support systems (GDSSs). Many previous GDSS designs have had success in these experiments with brainstorming but have been limited in providing judgment and choice support. TeamChoice for Windows is a multi-criteria decision-making GDSS that is currently under development in an effort to overcome the limitations of previous systems and to significantly advance the capabilities of GDSSs. This paper discusses some of the general aspects of GDSSs and their existing limitations, and explains the development and use of TeamChoice for Windows as a multi-criteria decision-making GDSS.
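
    The paper does not specify TeamChoice's scoring algorithm; as a generic illustration of how a multi-criteria group decision can be aggregated, here is a simple weighted-sum sketch with hypothetical alternatives, criteria weights, and member scores.

```python
import numpy as np

# Hypothetical data: two team members score three alternatives against
# three criteria. The weighted sum below is a generic multi-criteria
# aggregation, not TeamChoice's actual method.
criteria_weights = np.array([0.5, 0.3, 0.2])  # importance weights, sum to 1
scores = np.array([                           # shape: member x alternative x criterion
    [[7, 5, 9], [6, 8, 4], [8, 6, 7]],
    [[6, 6, 8], [7, 7, 5], [9, 5, 6]],
], dtype=float)

member_totals = scores @ criteria_weights     # weighted score per member/alternative
group_totals = member_totals.mean(axis=0)     # average across the group
best = int(np.argmax(group_totals))
print(f"group scores: {group_totals}, best alternative: {best}")
```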

    Democracy in Africa: Colonization vs. Modernization

    Get PDF
    A country’s degree of democratic development is the best predictor of economic prosperity. African nations are some of the poorest on the planet and tend to have low levels of democracy, while wealthier nations tend toward higher levels. If Africa is going to increase its economic output, theory suggests one of the best ways to accomplish that goal is to raise African democracy levels. Why do some countries in Africa develop democracy while others do not? I analyze the Freedom House and Polity IV democracy scores for each country to determine which countries are the most democratic, and compare them with historical and demographic data, such as political instability events, fragmentation, population, GDP, and colonial history, to give a more robust picture of which factors matter most in the development of democracy in Africa. I also analyze data on countries outside Africa to determine whether Africa has different prerequisites for democracy than the rest of the world. I theorize that literacy rates, urbanization, and the elimination of fragmentation may be more important than economic factors in the development of democracy in Africa.
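
    A minimal sketch of the kind of cross-country regression such an analysis might run, with a hypothetical file name and column names standing in for the Freedom House/Polity IV scores and the covariates listed above:

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical file and column names; the real dataset merges democracy
# scores with the demographic and historical covariates described above.
df = pd.read_csv("africa_democracy.csv")
predictors = ["literacy_rate", "urbanization", "fragmentation",
              "gdp_per_capita", "instability_events"]
X = sm.add_constant(df[predictors])
model = sm.OLS(df["polity_score"], X, missing="drop").fit()
print(model.summary())
```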