
    Imprecise Bernoulli processes

    In classical Bernoulli processes, it is assumed that a single Bernoulli experiment can be described by a precise and precisely known probability distribution. However, both of these assumptions can be relaxed. A first approach, often used in sensitivity analysis, is to drop only the second assumption: one assumes the existence of a precise distribution, but has insufficient resources to determine it precisely. The resulting imprecise Bernoulli process is the lower envelope of a set of precise Bernoulli processes. An alternative approach is to drop both assumptions, meaning that we do not assume the existence of a precise probability distribution and regard the experiment as inherently imprecise. In that case, a single imprecise Bernoulli experiment can be described by a set of desirable gambles. We show how this set can be extended to describe an imprecise Bernoulli process, by imposing the behavioral assessments of epistemic independence and exchangeability. The resulting analysis leads to surprisingly simple mathematical expressions characterizing this process, which turn out to be the same as the ones obtained through the straightforward sensitivity analysis approach.
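    Under the sensitivity-analysis reading described in this abstract, the lower envelope is easy to approximate numerically. The sketch below (a minimal illustration; the function name and all parameter values are assumptions, not taken from the paper) bounds the probability of observing k successes in n trials when the success probability is only known to lie in an interval:

```python
from math import comb

def binom_pmf(k, n, p):
    # Probability of exactly k successes in n Bernoulli trials with parameter p.
    return comb(n, k) * p**k * (1 - p)**(n - k)

def lower_upper_prob(k, n, p_lo, p_hi, steps=1000):
    # Sensitivity-analysis view: the imprecise Bernoulli process is the lower
    # envelope of all precise processes with p in [p_lo, p_hi]. A grid search
    # over the interval approximates the lower and upper probabilities.
    probs = [binom_pmf(k, n, p_lo + (p_hi - p_lo) * i / steps)
             for i in range(steps + 1)]
    return min(probs), max(probs)

# Example: 3 successes in 10 trials, success probability somewhere in [0.2, 0.4].
lo, hi = lower_upper_prob(k=3, n=10, p_lo=0.2, p_hi=0.4)
```

    A grid search is used rather than endpoint evaluation because the binomial pmf is not monotone in p: its maximum over the interval may be attained in the interior (near p = k/n), while its minimum is attained at an endpoint.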

    Empirical interpretation of imprecise probabilities

    This paper investigates the possibility of a frequentist interpretation of imprecise probabilities by generalizing the approach of Bernoulli’s Ars Conjectandi: that is, by studying, in the case of games of chance, under which assumptions imprecise probabilities can be satisfactorily estimated from data. Estimability on the basis of finite amounts of data is in fact a necessary condition for imprecise probabilities to have a clear empirical meaning. Unfortunately, imprecise probabilities can be estimated arbitrarily well from data only in very limited settings.

    A Box Particle Filter for Stochastic and Set-theoretic Measurements with Association Uncertainty

    This work develops a novel estimation approach for nonlinear dynamic stochastic systems by combining the sequential Monte Carlo method with interval analysis. Unlike the common pointwise measurements, the proposed solution is for problems with interval measurements with association uncertainty. The optimal theoretical solution can be formulated in the framework of random set theory as the Bernoulli filter for interval measurements. The straightforward particle filter implementation of the Bernoulli filter typically requires a huge number of particles, since the posterior probability density function occupies a significant portion of the state space. In order to reduce the number of particles, without necessarily sacrificing estimation accuracy, the paper investigates an implementation based on box particles. A box particle occupies a small and controllable rectangular region of non-zero volume in the target state space. The numerical results demonstrate that the filter performs remarkably well: both target state and target presence are estimated reliably using a very small number of box particles.
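    As a rough illustration of the box-particle idea only (a minimal 1-D sketch with assumed motion and measurement models, not the paper's Bernoulli filter): each particle is an interval rather than a point, prediction inflates it, and the update intersects it with the interval measurement and weights it by the surviving fraction of its volume.

```python
import random

random.seed(0)  # reproducibility of the resampling step

def predict(box, velocity=1.0, noise=0.5):
    # Shift the box by an assumed velocity and inflate it by the motion noise.
    lo, hi = box
    return (lo + velocity - noise, hi + velocity + noise)

def update(box, meas):
    # Contract the box by intersecting it with the interval measurement;
    # the weight is the fraction of the box's volume that survives.
    lo = max(box[0], meas[0])
    hi = min(box[1], meas[1])
    if lo >= hi:
        return None, 0.0  # empty intersection: the box gets zero weight
    return (lo, hi), (hi - lo) / (box[1] - box[0])

def step(boxes, meas):
    contracted, weights = [], []
    for b in boxes:
        nb, w = update(predict(b), meas)
        if nb is not None:
            contracted.append(nb)
            weights.append(w)
    # Multinomial resampling proportional to the surviving volume.
    return random.choices(contracted, weights=weights, k=len(boxes))

boxes = [(float(i), i + 2.0) for i in range(10)]  # initial box particles
boxes = step(boxes, meas=(4.0, 7.0))              # one interval measurement
```

    After one step, every surviving box particle lies inside the measurement interval; a full filter would also handle clutter, missed detections, and the target-presence variable of the Bernoulli filter.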

    Imprecise multinomial processes: an overview of different approaches and how they are related to each other

    In this overview, we present and compare four different approaches to imprecise multinomial processes, which are generalizations of the classical multinomial process to the field of imprecise probability theory. Within this field, one can choose between a number of different mathematical frameworks. Amongst the most important ones, we have credal sets, coherent lower previsions and coherent sets of desirable gambles. We show how each of them can be used to model beliefs about the outcome of a single experiment. We give an overview of different ways of extending these local models to describe an infinite sequence of experiments, leading to four different types of imprecise multinomial processes. We investigate their properties, discuss the assumptions that underlie them and show how they can be related to one another by imposing additional requirements. In particular, it turns out that by additionally imposing exchangeability, all four types of imprecise multinomial processes coincide, which ultimately provides us with a behavioural justification for applying sensitivity analysis to classical multinomial processes.
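    A minimal sketch of the credal-set framework mentioned in this abstract, with an assumed three-outcome credal set given by its extreme points (the numbers are purely illustrative): the lower and upper previsions of a gamble are the envelope of its expectations over those extreme points.

```python
# Extreme points of an assumed credal set over three outcomes, and a gamble
# assigning a payoff to each outcome.
extreme_points = [(0.5, 0.3, 0.2), (0.2, 0.5, 0.3), (0.3, 0.2, 0.5)]
gamble = (1.0, -1.0, 0.0)

def expectation(p, f):
    # Expected payoff of gamble f under the precise distribution p.
    return sum(pi * fi for pi, fi in zip(p, f))

# Lower/upper prevision: the envelope of expectations over the credal set.
lower = min(expectation(p, gamble) for p in extreme_points)
upper = max(expectation(p, gamble) for p in extreme_points)
```

    Because expectation is linear in p, the envelope over a (convex) credal set is attained at its extreme points, which is why the minimum and maximum above only need to range over that finite list.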

    First steps towards an imprecise Poisson process

    The Poisson process is the most elementary continuous-time stochastic process that models a stream of repeating events. It is uniquely characterised by a single parameter called the rate. Instead of a single value for this rate, we here consider a rate interval and let it characterise two nested sets of stochastic processes. We call these two sets of stochastic processes imprecise Poisson processes, explain why this is justified, and study the corresponding lower and upper (conditional) expectations. Besides a general theoretical framework, we also provide practical methods to compute lower and upper (conditional) expectations of functions that depend on the number of events at a single point in time.
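    For the sensitivity-analysis-style set of processes, lower and upper expectations of a function of the number of events at a single time point can be approximated by optimising over rates in the interval. The sketch below is an assumption-laden illustration, not the paper's method: the rate bounds, grid resolution and series truncation level are all made up for the example.

```python
from math import exp

def poisson_expectation(f, rate, t, n_max=200):
    # E[f(N_t)] for a precise Poisson process, truncating the series at n_max
    # terms; the pmf is built up via the recurrence pmf(n+1) = pmf(n) * mu/(n+1).
    mu = rate * t
    total, pmf = 0.0, exp(-mu)
    for n in range(n_max):
        total += f(n) * pmf
        pmf *= mu / (n + 1)
    return total

def lower_upper_expectation(f, rate_lo, rate_hi, t, steps=200):
    # Grid search over the rate interval approximates the lower/upper envelope.
    vals = [poisson_expectation(f, rate_lo + (rate_hi - rate_lo) * i / steps, t)
            for i in range(steps + 1)]
    return min(vals), max(vals)

# Lower/upper expected number of events at t = 2 for a rate in [1, 3];
# since E[N_t] = rate * t, these should come out near 2 and 6.
lo, hi = lower_upper_expectation(lambda n: n, 1.0, 3.0, t=2.0)
```

    For monotone functions of the count, the envelope is attained at the interval endpoints; the grid search is only needed for functions that are not monotone in the rate.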

    Eventology versus contemporary theories of uncertainty

    The development of probability theory, together with the Bayesian approach, over the last three centuries was driven by two factors: the variability of physical phenomena and partial ignorance about them. As it is now standard to believe [Dubois, 2007], the nature of these key factors is so varied that their description requires special uncertainty theories, which differ from probability theory and the Bayesian credo, and which provide a better account of the various facets of uncertainty by putting together probabilistic and set-valued representations of information to capture the distinction between variability and ignorance. Eventology [Vorobyev, 2007], a new direction of probability theory and philosophy, offers an original event-based approach to the description of variability and ignorance, entering an agent, together with his or her beliefs, directly into the framework of scientific research in the form of an eventological distribution of his or her own events. This allows eventology, by putting together probabilistic and set-event representations of information and the philosophical concept of event as co-being [Bakhtin, 1920], to provide a unified account of various aspects of uncertainty that captures the distinction between variability and ignorance, and opens an opportunity to define an imprecise probability as the probability of an imprecise event within the mathematical framework of Kolmogorov's probability theory [Kolmogorov, 1933].
    Keywords: uncertainty, probability, event, co-being, eventology, imprecise event

    Weakly-Supervised Temporal Localization via Occurrence Count Learning

    We propose a novel model for temporal detection and localization which allows the training of deep neural networks using only counts of event occurrences as training labels. This powerful weakly-supervised framework alleviates the burden of the imprecise and time-consuming process of annotating event locations in temporal data. Unlike existing methods, in which localization is explicitly achieved by design, our model learns localization implicitly as a byproduct of learning to count instances. This unique feature is a direct consequence of the model's theoretical properties. We validate the effectiveness of our approach in a number of experiments (drum hit and piano onset detection in audio, digit detection in images) and demonstrate performance comparable to that of fully-supervised state-of-the-art methods, despite much weaker training requirements. Comment: Accepted at ICML 201