
    Measuring questions: relevance and its relation to entropy

    The Boolean lattice of logical statements induces the free distributive lattice of questions. Inclusion on this lattice is based on whether one question answers another. Generalizing the zeta function of the question lattice leads to a valuation called relevance or bearing, which is a measure of the degree to which one question answers another. Richard Cox conjectured that this degree can be expressed as a generalized entropy. With the assistance of yet another important result from János Aczél, I show that this is indeed the case, and that the resulting inquiry calculus is a natural generalization of information theory. This approach provides a new perspective on the Principle of Maximum Entropy.

    Comment: 8 pages, 1 figure. Presented at the MaxEnt 2004 meeting in Garching, Germany. To be published in: R. Fischer, V. Dose (eds.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Garching, Germany 2004, AIP Conference Proceedings, American Institute of Physics, Melville, NY.
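
    The construction in this abstract can be summarized compactly. The LaTeX fragment below is a minimal, hedged sketch under assumed notation (the symbols ζ, d, and H and the normalization are illustrative, not quoted from the paper): the two-valued zeta function encoding "one question answers another" is relaxed to a degree of relevance, which Cox's conjecture identifies with an entropy.

    ```latex
    % Two-valued inclusion on the question lattice (notation assumed):
    \zeta(Q, Q') =
    \begin{cases}
      1 & \text{if } Q' \text{ answers } Q,\\
      0 & \text{otherwise.}
    \end{cases}
    % Generalizing \zeta to a real-valued bivaluation gives the relevance
    % (bearing) d(Q \mid Q') \in [0, 1].  Cox's conjecture, established with
    % a result of Acz\'el, expresses this degree as a (Shannon-type) entropy
    % over the mutually exclusive answers a_i to the question:
    d(Q \mid Q') \;\propto\; H = -\sum_i p(a_i) \log p(a_i).
    % Maximizing relevance then recovers the Principle of Maximum Entropy.
    ```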

    Statistical Deterministic and Ensemble Seasonal Prediction of Tropical Cyclones in the Northwest Australian Region

    Statistical seasonal prediction of tropical cyclones (TCs) has been ongoing for quite some time in many different ocean basins across the world. While a few basins (e.g., North Atlantic and western North Pacific) have been extensively studied and forecasted for many years, Southern Hemispheric TCs have been less frequently studied and generally grouped as a whole or into two primary basins: southern Indian Ocean and Australian. This paper investigates the predictability of TCs in the northwest Australian (NWAUS) basin of the southeast Indian Ocean (105°–135°E) and describes two statistical approaches to the seasonal prediction of TC frequency, TC days, and accumulated cyclone energy (ACE). The first approach is a traditional deterministic seasonal prediction using predictors identified from NCEP–NCAR reanalysis fields using multiple linear regression. The second is a 100-member statistical ensemble approach with the same predictors as the deterministic model but with a resampling of the dataset with replacement and smearing of the input values to generate slightly different coefficients in the multiple linear regression prediction equations. Both the deterministic and ensemble schemes provide forecasts that outperform climatology. The ensemble approach outperforms the deterministic model while also adding a quantitative measure of uncertainty that reflects the predictability of a given TC season.
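
    The ensemble scheme described above is essentially bootstrap aggregation with input smearing. The Python sketch below is an illustrative reconstruction under stated assumptions, not the authors' code: the function name, the smear fraction, and the use of ordinary least squares via NumPy are all assumptions; only the 100-member count, resampling with replacement, and smeared inputs come from the abstract.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def ensemble_forecast(X, y, X_new, n_members=100, smear=0.1):
        """Bootstrap-and-smear ensemble of multiple linear regressions.

        Each member refits ordinary least squares on a resample (with
        replacement) of the training data whose predictors are perturbed
        ("smeared") with Gaussian noise, so every member ends up with
        slightly different regression coefficients.
        """
        n, p = X.shape
        preds = np.empty((n_members, X_new.shape[0]))
        scale = smear * X.std(axis=0)                 # per-predictor smear width
        for m in range(n_members):
            idx = rng.integers(0, n, size=n)          # resample with replacement
            Xb = X[idx] + scale * rng.standard_normal((n, p))
            A = np.column_stack([np.ones(n), Xb])     # design matrix with intercept
            coef, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
            preds[m] = np.column_stack([np.ones(len(X_new)), X_new]) @ coef
        return preds  # spread across members quantifies seasonal uncertainty
    ```

    The member-to-member spread of the returned predictions supplies the quantitative uncertainty the abstract refers to, e.g. via percentiles across the 100 forecasts.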

    Information Physics: The New Frontier

    At this point in time, two major areas of physics, statistical mechanics and quantum mechanics, rest on the foundations of probability and entropy. The last century saw several significant fundamental advances in our understanding of the process of inference, which make it clear that these are inferential theories. That is, rather than being a description of the behavior of the universe, these theories describe how observers can make optimal predictions about the universe. In such a picture, information plays a critical role. What is more, small clues, such as the fact that black holes have entropy, continue to suggest that information is fundamental to physics in general. In the last decade, our fundamental understanding of probability theory has led to a Bayesian revolution. In addition, we have come to recognize that the foundations go far deeper and that Cox's approach of generalizing a Boolean algebra to a probability calculus is the first specific example of the more fundamental idea of assigning valuations to partially-ordered sets. By considering this as a natural way to introduce quantification to the more fundamental notion of ordering, one obtains an entirely new way of deriving physical laws. I will introduce this new way of thinking by demonstrating how one can quantify partially-ordered sets and, in the process, derive physical laws. The implication is that physical law does not reflect the order in the universe; instead, it is derived from the order imposed by our description of the universe. Information physics, which is based on understanding the ways in which we both quantify and process information about the world around us, is a fundamentally new approach to science.

    Comment: 17 pages, 6 figures. Knuth K.H. 2010. Information physics: the new frontier. J.-F. Bercher, P. Bessière, and A. Mohammad-Djafari (eds.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2010), Chamonix, France, July 2010.
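
    The poset-quantification idea mentioned above has a compact core. As a hedged sketch in assumed (standard lattice) notation: assigning a real valuation to lattice elements and demanding consistency with the lattice operations essentially forces a sum rule, of which Cox's probability calculus is the Boolean special case.

    ```latex
    % Sketch, notation assumed: v assigns reals to elements of a lattice.
    % Consistency with associativity of the join \vee forces, up to a
    % monotonic regraduation of v, the sum rule
    v(x \vee y) \;=\; v(x) + v(y) - v(x \wedge y).
    % On the Boolean lattice of statements this is the sum rule of
    % probability theory; other lattices yield other calculi.
    ```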

    The Problem of Motion: The Statistical Mechanics of Zitterbewegung

    Around 1930, both Gregory Breit and Erwin Schrödinger showed that the eigenvalues of the velocity of a particle described by wavepacket solutions to the Dirac equation are simply ±c, the speed of light. This led Schrödinger to coin the term Zitterbewegung, German for "trembling motion", in which all particles of matter (fermions) zig-zag back and forth at precisely the speed of light. The result is that any finite speed less than c, including the state of rest, only makes sense as a long-term average that can be thought of as a drift velocity. In this paper, we take seriously the idea that the observed velocities of particles are time averages of motion at the speed of light and demonstrate how the relativistic velocity addition rule in one spatial dimension is readily derived by considering the probabilities that a particle is observed to move either to the left or to the right at the speed of light.

    Comment: Knuth K.H. 2014. The problem of motion: the statistical mechanics of Zitterbewegung. Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Amboise, France, Sept 2014, AIP Conference Proceedings, American Institute of Physics, Melville, NY.
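
    The derivation sketched in this abstract is short enough to reproduce. The following is a hedged reconstruction in assumed notation: p and q are the probabilities of observing a rightward or leftward step at speed c, and the (assumed) composition rule is that independent biases multiply in the odds r = p/q.

    ```latex
    % Drift velocity as a time average of motion at speed c (notation assumed):
    \frac{v}{c} = \frac{p - q}{p + q} = \frac{r - 1}{r + 1},
    \qquad r \equiv \frac{p}{q}.
    % If two biases compose by multiplying the odds, r_3 = r_1 r_2, then
    \frac{v_3}{c} = \frac{r_1 r_2 - 1}{r_1 r_2 + 1}
                  = \frac{v_1/c + v_2/c}{1 + v_1 v_2 / c^2},
    % which is the relativistic velocity addition rule in one dimension.
    ```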

    Inferences about Interactions: Fermions and the Dirac Equation

    At a fundamental level, every measurement process relies on an interaction where one entity influences another. The boundary of an interaction is given by a pair of events, which can be ordered by virtue of the interaction. This results in a partially ordered set (poset) of events often referred to as a causal set. In this framework, an observer can be represented by a chain of events. Quantification of events and pairs of events, referred to as intervals, can be performed by projecting them onto an observer chain, or even a pair of observer chains, which in specific situations leads to a Minkowski metric replete with Lorentz transformations. We illustrate how this framework of interaction events gives rise to some of the well-known properties of fermions, such as Zitterbewegung. We then take this further by making inferences about events, employing the process calculus, which coincides with the Feynman path integral formulation of quantum mechanics. We show that in the 1+1 dimensional case this results in the Feynman checkerboard model of the Dirac equation describing a fermion at rest.

    Comment: 11 pages, 3 figures. To be published in the MaxEnt 2012 proceedings.
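
    For the 1+1 dimensional result mentioned last, the checkerboard kernel has a well-known closed form, reproduced here as a hedged sketch (sign and phase conventions for the amplitude per reversal vary across presentations and are assumed, not quoted from the paper):

    ```latex
    % Feynman checkerboard sketch (phase convention assumed): paths zig-zag
    % at speed c on a lattice of step \epsilon; a path with R reversals of
    % direction contributes (-i \epsilon m c^2 / \hbar)^R, so the kernel is
    K(b, a) \;\propto\; \sum_{R} N(R)\,
        \left(-\,\frac{i\,\epsilon\, m c^2}{\hbar}\right)^{\!R},
    % where N(R) counts paths from a to b with exactly R reversals.  In the
    % continuum limit \epsilon \to 0 this kernel obeys the 1+1D Dirac equation.
    ```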