
    Practical Model-Based Diagnosis with Qualitative Possibilistic Uncertainty

    An approach to fault isolation that exploits vastly incomplete models is presented. It relies on separate descriptions of each component's behavior, together with the links between components, which makes it possible to focus the reasoning on the relevant part of the system. Since normal observations do not need explanation, the behavior of the components is limited to anomaly propagation. Diagnostic solutions are disorders (fault modes or abnormal signatures) that are consistent with the observations, as well as abductive explanations. An ordinal representation of uncertainty based on possibility theory provides a simple exception-tolerant description of the component behaviors. We can, for instance, distinguish between effects that are more or less certainly present (or absent) and effects that are more or less possibly present (or absent) when a given anomaly is present. A realistic example illustrates the benefits of this approach. Comment: Appears in Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence (UAI 1995).
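
As context for the ordinal uncertainty representation described above, here is a minimal sketch (not the paper's implementation) of how possibility and necessity degrees can grade how certainly an effect is present or absent under a given anomaly; the fault model, component names, and degrees are all hypothetical.

```python
# Minimal sketch of possibility/necessity grading for anomaly effects;
# not the paper's system. For each (anomaly, effect) pair we store the
# pair (Pi(present), Pi(absent)); at least one of the two must be 1.
model = {
    ("pump_leak", "low_pressure"): (1.0, 0.2),  # rather certainly present
    ("pump_leak", "no_flow"):      (0.5, 1.0),  # only possibly present
}

def necessity_present(anomaly, effect):
    """N(present) = 1 - Pi(absent): how certain the effect is to appear."""
    _, poss_absent = model[(anomaly, effect)]
    return 1.0 - poss_absent

def necessity_absent(anomaly, effect):
    """N(absent) = 1 - Pi(present): how certain the effect is to be missing."""
    poss_present, _ = model[(anomaly, effect)]
    return 1.0 - poss_present

# "low_pressure" is rather certainly present under a pump leak,
# while "no_flow" is merely possible, not at all certain:
assert necessity_present("pump_leak", "low_pressure") == 0.8
assert necessity_present("pump_leak", "no_flow") == 0.0
```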

    Categorical Quantum Dynamics

    We use strong complementarity to introduce dynamics and symmetries within the framework of categorical quantum mechanics (CQM), which we also extend to infinite-dimensional separable Hilbert spaces: these were long-missing features, and they open the way to a wealth of new applications. The coherent treatment presented in this work also provides a variety of novel insights into the dynamics and symmetries of quantum systems: examples include the extremely simple characterisation of symmetry-observable duality, the connection of strong complementarity with the Weyl canonical commutation relations, generalisations of Feynman's clock construction, the existence of time observables, and the emergence of quantum clocks. Furthermore, we show that strong complementarity is a key resource for quantum algorithms and protocols. We provide the first fully diagrammatic, theory-independent proof of correctness for the quantum algorithm solving the Hidden Subgroup Problem, and show that strong complementarity is the feature providing the quantum advantage. In quantum foundations, we use strong complementarity to derive the exact conditions relating non-locality to the structure of phase groups, within the context of Mermin-type non-locality arguments. Our non-locality results find further application in quantum cryptography, where we use them to define a quantum-classical secret sharing scheme with provable device-independent security guarantees. All in all, we argue that strong complementarity is a truly powerful and versatile building block for quantum theory and its applications, and one that should draw much more attention in the future. Comment: Thesis submitted for the degree of Doctor of Philosophy, Oxford University, Michaelmas Term 2016 (273 pages).
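
For reference, the Weyl form of the canonical commutation relations mentioned above is a standard fact, stated here for context rather than taken from the thesis: with the convention [Q, P] = i and hbar = 1, the unitary one-parameter groups generated by position and momentum commute up to a phase.

```latex
% Weyl form of the canonical commutation relations (convention [Q,P] = i,
% hbar = 1): the generated one-parameter unitary groups commute up to a phase.
\[
  e^{isP}\, e^{itQ} \;=\; e^{ist}\, e^{itQ}\, e^{isP},
  \qquad s, t \in \mathbb{R}.
\]
```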

    Coherent Price Systems and Uncertainty-Neutral Valuation

    We consider fundamental questions of arbitrage pricing that arise when the uncertainty model is given by a set of possibly mutually singular probability measures. With a single probability model, the essential equivalence between the absence of arbitrage and the existence of an equivalent martingale measure is a folk theorem; see Harrison and Kreps (1979). We establish a microeconomic foundation of sublinear price systems and present an extension result. In this context we introduce a prior-dependent notion of marketed spaces and viable price systems. We associate this extension with a canonically altered concept of equivalent symmetric martingale measure sets, in a dynamic trading framework under the absence of prior-dependent arbitrage. We prove the existence of such sets when volatility uncertainty is modeled by a stochastic differential equation driven by Peng's G-Brownian motion.
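
As a rough illustration of the volatility uncertainty in the last sentence (not the paper's construction), the sublinear expectation associated with a G-Brownian motion can be lower-bounded by Monte Carlo over a family of constant-volatility Brownian motions; the payoff and volatility band below are hypothetical.

```python
# Illustrative sketch only: a crude lower bound on the G-expectation
# E^G[f(B_T)] obtained by maximizing over constant volatilities in
# [sig_lo, sig_hi]. The true G-expectation optimizes over all adapted
# volatility processes, so in general this is only a lower bound.
import numpy as np

rng = np.random.default_rng(0)

def g_expectation_lower_bound(f, T=1.0, sig_lo=0.1, sig_hi=0.3,
                              n_sigmas=21, n_paths=100_000):
    z = rng.standard_normal(n_paths)           # terminal Gaussian draws
    sigmas = np.linspace(sig_lo, sig_hi, n_sigmas)
    return max(f(s * np.sqrt(T) * z).mean() for s in sigmas)

# For a convex payoff the worst case sits at the highest volatility:
print(g_expectation_lower_bound(lambda x: np.maximum(x - 0.1, 0.0)))
```

For convex payoffs of the terminal value this bound is in fact tight, since the worst-case volatility is then constant at the upper edge of the band.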

    The legacy of 50 years of fuzzy sets: A discussion

    This note provides a brief overview of the main ideas and notions underlying fifty years of research in fuzzy set theory and possibility theory, two important settings introduced by L.A. Zadeh for representing sets with unsharp boundaries and uncertainty induced by granules of information expressed with words. The discussion is organized around three potential understandings of the grades of membership in a fuzzy set, depending on what the fuzzy set is intended to represent: a group of elements with borderline members, a plausibility distribution, or a preference profile. The note also questions the motivations for some existing generalized fuzzy sets. It clearly reflects the shared personal views of its authors.

    A framework of distributionally robust possibilistic optimization

    In this paper, an optimization problem with uncertain constraint coefficients is considered. Possibility theory is used to model the uncertainty: a joint possibility distribution on the constraint coefficient realizations, called scenarios, is specified. This possibility distribution induces a necessity measure on the scenario set, which in turn describes an ambiguity set of probability distributions on the scenario set. The distributionally robust approach is then used to convert the imprecise constraints into deterministic equivalents: the left-hand side of an imprecise constraint is evaluated with a risk measure under the worst probability distribution that can occur. In this paper, the Conditional Value at Risk is used as the risk measure, which generalizes the strict robust and expected value approaches commonly used in the literature. A general framework for solving this class of problems is described, and some cases that can be solved in polynomial time are identified.
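
To make the risk measure step concrete, here is a minimal sketch (not the paper's algorithm) of the Rockafellar-Uryasev representation of Conditional Value at Risk over a discrete scenario set, CVaR_alpha(L) = min_t { t + E[(L - t)_+] / (1 - alpha) }; the scenario losses and the probability distribution are hypothetical.

```python
# Minimal sketch of CVaR over a discrete scenario set via the
# Rockafellar-Uryasev formula; the scenario data are hypothetical.
# In the distributionally robust setting one would additionally
# maximize this value over the distributions in the ambiguity set.
import numpy as np

def cvar(losses, probs, alpha):
    """CVaR_alpha(L) = min_t t + E[(L - t)_+] / (1 - alpha).
    For a discrete distribution the minimum is attained at one of
    the scenario values, so it suffices to try each of them."""
    losses = np.asarray(losses, dtype=float)
    probs = np.asarray(probs, dtype=float)
    return min(t + np.maximum(losses - t, 0.0) @ probs / (1.0 - alpha)
               for t in losses)

losses = [1.0, 2.0, 5.0, 10.0]         # scenario losses (hypothetical)
probs  = [0.4, 0.3, 0.2, 0.1]          # one distribution from the ambiguity set
print(cvar(losses, probs, alpha=0.9))  # mean loss in the worst 10% tail -> 10.0
```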

    Two Forms of Inconsistency in Quantum Foundations

    Recently, there has been some discussion of how Dutch Book arguments might be used to demonstrate the rational incoherence of certain hidden variable models of quantum theory (Feintzeig and Fletcher 2017). In this paper, we argue that the 'form of inconsistency' underlying this alleged irrationality is deeply and comprehensively related to the more familiar 'inconsistency' phenomenon of contextuality. Our main result is that the hierarchy of contextuality due to Abramsky and Brandenburger (2011) corresponds to a hierarchy of additivity/convexity violations which yields formal Dutch Books of different strengths. We then use this result to provide a partial assessment of whether these formal Dutch Books can be interpreted normatively. Comment: 26 pages, 5 figures.
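
For intuition on the link between additivity violations and Dutch Books, a standard textbook example (not taken from the paper): an agent whose betting prices violate additivity, say p(A) = p(not-A) = 0.6, buys a $1 bet on each event and is guaranteed to lose.

```latex
% Standard Dutch-Book arithmetic for an additivity violation
% (textbook example, not from the paper): the agent pays both prices,
% but exactly one of the two bets pays out, whichever way A turns out.
\[
  \text{total price} = p(A) + p(\neg A) = 1.2,
  \qquad
  \text{total payout} = \mathbf{1}_A + \mathbf{1}_{\neg A} = 1,
\]
\[
  \text{guaranteed loss} = 1.2 - 1 = 0.2 > 0.
\]
```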

    Possibilistic uncertainty analysis of a conceptual model of snowmelt runoff

    This study presents an analysis of the predictive uncertainty of a conceptual snowmelt runoff model. The method applied uses possibilistic rather than probabilistic calculus for the evaluation of predictive uncertainty. Possibility theory is an information theory meant to model uncertainties caused by imprecise or incomplete knowledge about a real system rather than by randomness. A snow-dominated catchment in the Chilean Andes is used as the case study. Predictive uncertainty arising from parameter uncertainties of the watershed model is assessed. Model performance is evaluated according to several criteria in order to define the possibility distribution of the parameter vector. The plausibility of the simulated glacier mass balance and snow cover is used to further constrain the model representations. Possibility distributions of the discharge estimates and prediction uncertainty bounds are subsequently derived. The results of the study indicate that the use of additional information allows a reduction of predictive uncertainty. In particular, the assessment of the simulated glacier mass balance and snow cover helps to reduce the width of the uncertainty bounds without a significant increase in the number of unbounded observations (observations falling outside the bounds).
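
A minimal sketch (not the study's code) of the general recipe described above: normalize a performance score for each sampled parameter set into a possibility degree, then read off prediction bounds from an alpha-cut of the retained parameter sets; the scores and simulated discharge series below are synthetic placeholders.

```python
# Illustrative sketch of possibilistic prediction bounds, not the
# study's code: performance scores are normalized into possibility
# degrees, and an alpha-cut over parameter sets yields per-time-step
# discharge uncertainty bounds.
import numpy as np

rng = np.random.default_rng(1)

n_sets, n_steps = 500, 100
scores = rng.uniform(0.0, 1.0, n_sets)            # performance per parameter set
sims = rng.normal(10.0, 2.0, (n_sets, n_steps))   # simulated discharge series

poss = scores / scores.max()   # normalize so that sup(poss) = 1

alpha = 0.5                    # keep parameter sets with possibility >= alpha
cut = sims[poss >= alpha]
lower = cut.min(axis=0)        # lower prediction bound per time step
upper = cut.max(axis=0)        # upper prediction bound per time step
print(lower[:3], upper[:3])
```

Tightening the possibility distribution with extra criteria, as the study does with glacier mass balance and snow cover, shrinks the alpha-cut and hence the bounds.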

    Empirical White Noise Processes and the Subjective Probabilistic Approaches

    The paper discusses the identification of empirical white noise processes generated by deterministic numerical algorithms. The introduced fuzzy-random complementary approach can identify the inner hidden correlational patterns of an empirical white noise process if the process has a real hidden structure of this kind. We show how the characteristics of autocorrelated white noise processes change as the order of autocorrelation increases. Although in this paper we rely on random number generators to obtain approximate white noise processes, in our upcoming research we plan to turn our focus to physical white noise processes in order to validate our hypothesis.
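
To illustrate the kind of hidden structure the paper looks for, a minimal sketch (not the authors' fuzzy-random approach): generate an AR(1) series that passes for white noise at a glance and compare its lag-1 sample autocorrelation with that of a plain pseudorandom series; the parameters are arbitrary.

```python
# Illustrative sketch, not the authors' method: an i.i.d. pseudorandom
# series versus an AR(1) series, compared by lag-1 sample autocorrelation.
import numpy as np

rng = np.random.default_rng(42)
n, phi = 10_000, 0.3

iid = rng.standard_normal(n)    # approximate white noise from a PRNG

ar1 = np.empty(n)               # AR(1): x[t] = phi * x[t-1] + e[t]
ar1[0] = rng.standard_normal()
for t in range(1, n):
    ar1[t] = phi * ar1[t - 1] + rng.standard_normal()

def lag1_autocorr(x):
    x = x - x.mean()
    return (x[:-1] @ x[1:]) / (x @ x)

print(lag1_autocorr(iid))   # close to 0
print(lag1_autocorr(ar1))   # close to phi
```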