
    Eventology versus contemporary theories of uncertainty

    The development of probability theory, together with the Bayesian approach, over the last three centuries has been driven by two factors: the variability of physical phenomena and our partial ignorance about them. As it is now standard to believe [Dubois, 2007], the nature of these two key factors is so different that describing them requires special uncertainty theories, which differ from probability theory and the Bayesian credo and provide a better account of the various facets of uncertainty by putting together probabilistic and set-valued representations of information to capture the distinction between variability and ignorance. Eventology [Vorobyev, 2007], a new direction in probability theory and philosophy, offers an original event-based approach to the description of variability and ignorance, bringing the agent, together with his/her beliefs, directly into the framework of scientific research in the form of an eventological distribution of his/her own events. By combining probabilistic and set-event representations of information with the philosophical concept of event as co-being [Bakhtin, 1920], eventology provides a unified account of various aspects of uncertainty, capturing the distinction between variability and ignorance and opening the possibility of defining an imprecise probability as the probability of an imprecise event within the mathematical framework of Kolmogorov's probability theory [Kolmogorov, 1933].
    Keywords: uncertainty, probability, event, co-being, eventology, imprecise event
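
    The following minimal sketch illustrates one possible reading of the last claim. The modeling choice, treating an "imprecise event" as a family of ordinary events in a finite Kolmogorov probability space with a set-valued probability summarized as an interval, is an assumption made for illustration only, not eventology's formal definition.

        # Minimal sketch: an "imprecise event" modeled as a family of ordinary
        # events in a finite Kolmogorov probability space (assumption for
        # illustration only, not eventology's formal definition).
        omega = {"rain", "snow", "sun", "cloud"}                      # sample space
        P = {"rain": 0.3, "snow": 0.1, "sun": 0.4, "cloud": 0.2}      # probability measure
        assert set(P) == omega                                        # P is defined on all of omega

        def prob(event):
            # Ordinary Kolmogorov probability of an event (a subset of omega).
            return sum(P[w] for w in event)

        # Hypothetical imprecise event: the agent is unsure which ordinary
        # event describes "bad weather".
        imprecise_event = [{"rain"}, {"rain", "snow"}, {"rain", "snow", "cloud"}]

        probs = [prob(A) for A in imprecise_event]
        print("imprecise probability interval:", (min(probs), max(probs)))  # (0.3, 0.6)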

    Kolmogorov Similarity Hypotheses for Scalar Fields: Sampling Intermittent Turbulent Mixing in the Ocean and Galaxy

    Kolmogorov's three universal similarity hypotheses are extrapolated to describe scalar fields like temperature mixed by turbulence. By the analogous Kolmogorov third hypothesis for scalars, temperature dissipation rates chi averaged over lengths r > L_K should be lognormally distributed with intermittency factors I that increase with increasing turbulence energy length scales L_O as I_chi-r = m_T ln(L_O/r). Tests of Kolmogorovian velocity and scalar universal similarity hypotheses for very large ranges of turbulence length and time scales are provided by data from the ocean and the Galactic interstellar medium. The universal constant for turbulent mixing intermittency m_T is estimated from oceanic data to be 0.44+-0.01, which is remarkably close to estimates for Kolmogorov's turbulence intermittency constant m_u of 0.45+-0.05 from Galactic as well as atmospheric data. Extreme intermittency complicates the oceanic sampling problem, and may lead to quantitative and qualitative undersampling errors in estimates of mean oceanic dissipation rates and fluxes. Intermittency of turbulence and mixing in the interstellar medium may be a factor in the formation of stars.
    Comment: 23 pages, original of Proc. Roy. Soc. article, 8 figures; in "Turbulence and Stochastic Processes: Kolmogorov's ideas 50 years on", J.C.R. Hunt, O.M. Phillips, D. Williams, Eds., Proc. Roy. Soc. Lond. A, vol. 434 (no. 1890), pages 1-240, The Royal Society, London, 1991; PDF file
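
    As a quick illustration of the quoted scaling law, the sketch below evaluates I_chi-r = m_T ln(L_O/r) with the abstract's oceanic estimate m_T = 0.44; the numerical values of L_O and r are hypothetical, chosen only to show the computation.

        import math

        def intermittency_factor(L_O, r, m_T=0.44):
            # I_chi-r = m_T * ln(L_O / r), for averaging lengths r > L_K
            # (m_T = 0.44 is the oceanic estimate quoted in the abstract).
            return m_T * math.log(L_O / r)

        # Hypothetical scales: turbulence energy length scale L_O = 100 m,
        # averaging length r = 1 m.
        print(intermittency_factor(L_O=100.0, r=1.0))   # ~2.03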

    The possibility of impossible stairways and greener grass

    In classical game theory, players have finitely many actions and evaluate outcomes of mixed strategies using a von Neumann-Morgenstern utility function. Allowing a larger, but countable, player set introduces a host of phenomena that are impossible in finite games. Firstly, in coordination games, all players have the same preferences: switching to a weakly dominant action makes everyone at least as well off as before. Nevertheless, there are coordination games where the best outcome occurs if everyone chooses a weakly dominated action, while the worst outcome occurs if everyone chooses the weakly dominant action. Secondly, the location of payoff-dominant equilibria behaves capriciously: two coordination games that look so much alike that even the consequences of unilateral deviations are the same may nevertheless have disjoint sets of payoff-dominant equilibria. Thirdly, a large class of games has no (pure or mixed) Nash equilibria. Following the proverb "the grass is always greener on the other side of the hedge", greener-grass games model constant discontent: in one part of the strategy space, players would rather switch to its complement. Once there, they'd rather switch back.
    Keywords: coordination games; dominant strategies; payoff-dominance; nonexistence of equilibrium; tail events
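
    To make the first phenomenon concrete, here is a hypothetical common-payoff game in the same spirit (an illustration only, not necessarily the paper's construction). Countably many players i = 1, 2, ... each choose a_i in {0, 1}, and every player receives the common payoff

        \[
        u(a) =
        \begin{cases}
        3, & \text{if only finitely many players choose } 1,\\
        1 + \sum_i a_i 2^{-i}, & \text{if infinitely many players choose } 1 \text{ and infinitely many choose } 0,\\
        0, & \text{if only finitely many players choose } 0.
        \end{cases}
        \]

    A unilateral deviation never changes which case applies, so switching to action 1 never hurts and strictly helps in the middle case: action 1 is weakly dominant. Yet the profile in which everyone plays the weakly dominated action 0 yields the best payoff 3, while the profile in which everyone plays the weakly dominant action 1 yields the worst payoff 0. With finitely many players the three cases collapse and no such game exists.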

    Bell's Inequality Violations: Relation with de Finetti's Coherence Principle and Inferential Analysis of Experimental Data

    It is often believed that de Finetti's coherence principle naturally leads, in the finite case, to Kolmogorov's probability theory of random phenomena, which then implies Bell's inequality. Thus, not only does a violation of Bell's inequality look paradoxical in the Kolmogorovian framework, it should also violate de Finetti's coherence principle. Firstly, we show that this is not the case: the typical theoretical violations of Bell's inequality in quantum physics are in agreement with de Finetti's coherence principle. Secondly, we look for statistical evidence of such violations: we consider the experimental data from measurements of the polarization of photons, performed to verify empirically violations of Bell's inequality, and, on the basis of the estimated violation, we test the null hypothesis of Kolmogorovianity for the observed phenomenon. By standard inferential techniques we compute the p-value for the test and obtain a strong conclusion against the Kolmogorovian hypothesis.
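
    The sketch below is a schematic illustration of the kind of test described, not the paper's data or exact inferential procedure. Under the Kolmogorovian (local-realistic) null hypothesis the CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b') satisfies S <= 2, so an estimate of S well above 2 yields a small one-sided p-value; the correlation estimates and standard errors below are hypothetical.

        import math

        # Hypothetical correlation estimates E(a,b) etc. from four polarization
        # settings, each with a (hypothetical) standard error.
        E = {"ab": 0.70, "ab'": -0.71, "a'b": 0.70, "a'b'": 0.71}
        se = {k: 0.01 for k in E}

        S_hat = E["ab"] - E["ab'"] + E["a'b"] + E["a'b'"]     # ~2.82
        S_se = math.sqrt(sum(s**2 for s in se.values()))      # error propagation

        # One-sided z-test of H0: S <= 2 (the Kolmogorovian bound).
        z = (S_hat - 2.0) / S_se
        p_value = 0.5 * math.erfc(z / math.sqrt(2))
        print(f"S_hat = {S_hat:.2f}, z = {z:.1f}, p = {p_value:.2e}")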

    Interpretive Implications of the Sample Space

    In this paper I claim that Kolmogorov's probability theory has other basic notions in addition to 'probability' and 'event'. These notions are described by the sample space component of his probability space structure. This claim has several interesting consequences, two of which I discuss in this paper. The major consequence is that the main interpretations of probability theory are in fact not interpretations of Kolmogorov's theory, simply because an interpretation of a mathematical theory in a strict sense must explicate all of the theory's basic notions, while the main interpretations of probability do not explicate all of Kolmogorov's theory's basic notions. In particular, the main interpretations only explicate 'probability' and 'event' and do not explicitly address the additional basic notions which I claim Kolmogorov's theory includes. The other important consequence of my claim concerns the relation between 'probability' and 'event'. Very roughly, contrary to the common conception of 'events' as independent of 'probabilities', I claim that in some cases 'probabilities' can determine 'events'.
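
    For reference, the structure being discussed is Kolmogorov's probability space, the triple

        \[
        (\Omega, \mathcal{F}, P),
        \]

    where \Omega is the sample space, \mathcal{F} is a \sigma-algebra of subsets of \Omega (the events), and P is a probability measure on \mathcal{F}. The main interpretations explicate \mathcal{F} and P; the paper's claim concerns the further basic notions carried by \Omega.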

    Reliability and maintainability assessment factors for reliable fault-tolerant systems

    A long term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10 year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. Described are the numerous factors that potentially have a degrading effect on system reliability, and the ways in which these factors, peculiar to highly reliable fault tolerant systems, are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.
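
    The following Monte Carlo sketch illustrates the kind of stochastic question such a methodology addresses; it is not CARE III or GLOSS. The assumed model (a triple-modular-redundant system with exponential failure times and imperfect fault coverage) and all parameter values are hypothetical.

        import random

        def tmr_survives(mission_time, failure_rate, coverage, rng):
            # One trial: does a triple-modular-redundant (TMR) system survive
            # the mission? An uncovered fault fails the system immediately;
            # the system also fails once fewer than 2 modules remain.
            t, working = 0.0, 3
            while True:
                t += rng.expovariate(working * failure_rate)   # next module failure
                if t > mission_time:
                    return True
                if rng.random() > coverage:                    # fault not covered
                    return False
                working -= 1
                if working < 2:                                # majority voting lost
                    return False

        rng = random.Random(0)
        trials = 100_000
        ok = sum(tmr_survives(10.0, 1e-2, 0.99, rng) for _ in range(trials))
        print("estimated mission reliability:", ok / trials)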

    CARE 3, Phase 1, Volume 1

    A computer program to aid in assessing the reliability of fault tolerant avionics systems was developed. A simple mathematical expression was used to evaluate the reliability of any redundant configuration over any interval during which the failure rates and coverage parameters remained unaffected by configuration changes. Provision was made for convolving such expressions in order to evaluate the reliability of a dual mode system. A coverage model was also developed to determine the various relevant coverage coefficients as a function of the available hardware and software fault detector characteristics, and the subsequent isolation and recovery delay statistics.
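
    As an illustration of the kind of expression described (an assumed example, not necessarily the report's own), the reliability of a duplex configuration with constant per-unit failure rate \lambda and coverage c, over an interval of length t with no configuration changes, is

        \[
        R(t) = e^{-2\lambda t} + 2c\,e^{-\lambda t}\bigl(1 - e^{-\lambda t}\bigr),
        \]

    the first term being the probability that both units survive the interval and the second the probability that the first failure is covered and the surviving unit completes the interval. Convolving such expressions across successive configuration intervals, as described above, then yields the reliability of a dual mode system.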