
    Paraconsistent Sensitivity Analysis for Bayesian Significance Tests

    In this paper, the notion of degree of inconsistency is introduced as a tool to evaluate the sensitivity of the Full Bayesian Significance Test (FBST) value of evidence with respect to changes in the prior or reference density. To this end, both the definition of the FBST, a possibilistic approach to hypothesis testing based on Bayesian probability procedures, and the use of bilattice structures, as introduced by Ginsberg and Fitting in paraconsistent logics, are reviewed. The computational and theoretical advantages of using the proposed degree-of-inconsistency-based sensitivity evaluation as an alternative to traditional statistical power analysis are also discussed.
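
    A minimal computational sketch of the sensitivity question described above, assuming a Beta posterior, a point null hypothesis and two illustrative reference densities; the paper's bilattice-based degree of inconsistency is not reproduced here, the sketch only computes the FBST e-value under each reference so the two results can be compared:

```python
import numpy as np
from scipy import stats

# Illustrative assumptions: a Beta(8, 4) posterior, the sharp hypothesis
# H: theta = 0.5, and two reference densities (flat and Jeffreys-like).
posterior = stats.beta(8.0, 4.0)
theta0 = 0.5
references = {
    "flat": lambda t: np.ones_like(t),
    "jeffreys-like": lambda t: 1.0 / np.sqrt(t * (1.0 - t)),
}

# Monte Carlo draws from the posterior, reused for every reference density.
draws = posterior.rvs(size=200_000, random_state=1)

for name, r in references.items():
    def s(t, r=r):
        """Surprise function s(theta) = p(theta | x) / r(theta)."""
        return posterior.pdf(t) / r(t)
    s_star = s(np.array([theta0]))[0]          # supremum of s over H (a single point)
    ev = 1.0 - np.mean(s(draws) > s_star)      # e-value: 1 - posterior mass of {s > s*}
    print(f"reference = {name:13s}  ev(H) = {ev:.3f}")
```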

    FBST for Mixture Model Selection

    The Fully Bayesian Significance Test (FBST) is a coherent Bayesian significance test for sharp hypotheses. This paper proposes the FBST as a model selection tool for general mixture models and compares its performance with Mclust, a model-based clustering software package. The FBST's robust performance strongly encourages further developments and investigations.

    Intentional Sampling by Goal Optimization with Decoupling by Stochastic Perturbation

    Intentional sampling methods are non-probabilistic procedures that select a group of individuals for a sample with the purpose of meeting specific prescribed criteria. They are intended for exploratory research or pilot studies where tight budget constraints preclude the use of traditional randomized representative sampling. The possibility of subsequently generalizing statistically from such deterministic samples to the general population has been the subject of long-standing arguments and debates. Nevertheless, the intentional sampling techniques developed in this paper explore pragmatic strategies for overcoming some of the real or perceived shortcomings and limitations of intentional sampling in practical applications.
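
    A toy sketch of the general idea named in the title, goal optimization with a stochastic perturbation, offered as an assumption rather than as the paper's actual algorithm: units are chosen greedily so that a sample statistic tracks a prescribed target, and a small random perturbation of the selection score decouples the choice among near-equivalent candidates. All parameter values below are illustrative.

```python
import numpy as np

# Toy illustration, not the paper's algorithm: pick n units so that the sample
# mean of an illustrative covariate stays close to a prescribed target, adding
# a small random perturbation to each candidate's score (the "decoupling").
rng = np.random.default_rng(0)
population = rng.normal(loc=10.0, scale=3.0, size=500)   # illustrative covariate
target_mean, n = 10.0, 30                                # prescribed goal and sample size

chosen: list[int] = []
for _ in range(n):
    remaining = [i for i in range(len(population)) if i not in chosen]
    scores = []
    for i in remaining:
        trial_mean = population[chosen + [i]].mean()      # goal if unit i is added
        goal_gap = abs(trial_mean - target_mean)          # distance to the prescribed goal
        scores.append(goal_gap + rng.normal(scale=1e-3))  # stochastic perturbation
    chosen.append(remaining[int(np.argmin(scores))])

print(f"sample mean = {population[chosen].mean():.3f}  (target {target_mean})")
```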

    Paraconsistent probabilities: consistency, contradictions and Bayes' theorem

    This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
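
    For reference, the classical conditionalization rule that the paraconsistent update described above generalizes (only the standard form is shown; the paraconsistent version itself is not reproduced here):

```latex
% Bayes' theorem for conditionalization over a partition of hypotheses H_1, ..., H_n.
P(H_i \mid E) \;=\; \frac{P(E \mid H_i)\, P(H_i)}{\sum_{j=1}^{n} P(E \mid H_j)\, P(H_j)}
```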

    Logically-consistent hypothesis testing and the hexagon of oppositions

    Although logical consistency is desirable in scientific research, standard statistical hypothesis tests are typically logically inconsistent. To address this issue, previous work introduced agnostic hypothesis tests and proved that they can be logically consistent while retaining statistical optimality properties. This article characterizes the credal modalities in agnostic hypothesis tests and uses the hexagon of oppositions to explain the logical relations between these modalities. Geometric solids that are composed of hexagons of oppositions illustrate the conditions for these modalities to be logically consistent. Prisms composed of hexagons of oppositions show how the credal modalities obtained from two agnostic tests vary according to their threshold values. Nested hexagons of oppositions summarize logical relations between the credal modalities in these tests and prove new relations.
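
    A minimal sketch of the three-valued decision rule that underlies an agnostic test, assuming a generic evidence measure and two illustrative thresholds c0 < c1; the article's specific construction and its credal modalities are not reproduced here:

```python
# Minimal sketch of an agnostic test: two thresholds split the evidence scale
# into reject / remain agnostic / accept regions. Threshold values are illustrative.

def agnostic_decision(evidence: float, c0: float = 0.05, c1: float = 0.95) -> str:
    """Return 'reject', 'agnostic', or 'accept' for the hypothesis H."""
    if evidence <= c0:
        return "reject"
    if evidence >= c1:
        return "accept"
    return "agnostic"

for ev in (0.02, 0.50, 0.97):
    print(f"evidence = {ev:.2f} -> {agnostic_decision(ev)}")
```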

    Constructive Verification, Empirical Induction, and Falibilist Deduction: A Threefold Contrast

    This article explores some open questions related to the problem of verification of theories in the context of empirical sciences by contrasting three epistemological frameworks. Each of these epistemological frameworks is based on a corresponding central metaphor, namely: (a) Neo-empiricism and the gambling metaphor; (b) Popperian falsificationism and the scientific tribunal metaphor; (c) Cognitive constructivism and the object as eigen-solution metaphor. Each of these epistemological frameworks has also historically co-evolved with a certain statistical theory and method for testing scientific hypotheses, respectively: (a) Decision-theoretic Bayesian statistics and Bayes factors; (b) Frequentist statistics and p-values; (c) Constructive Bayesian statistics and e-values. This article examines with special care the Zero Probability Paradox (ZPP), related to the verification of sharp or precise hypotheses. Finally, this article makes some remarks on Lakatos’ view of mathematics as a quasi-empirical science.
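
    For concreteness, the Zero Probability Paradox mentioned above can be stated in one line (the notation is assumed here, not taken from the article):

```latex
% Under any continuous prior density p(\theta), a sharp hypothesis
% H : \theta = \theta_0 receives zero prior (and posterior) probability,
P(\theta = \theta_0) \;=\; \int_{\{\theta_0\}} p(\theta)\, d\theta \;=\; 0 ,
% so Bayes-factor tests of H require an explicit point mass at \theta_0,
% while the e-value approach works with the posterior density instead.
```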

    Bayesian epistemic values: focus on surprise, measure probability!

    The e-value or epistemic value, ev(H), measures the statistical significance of H, a hypothesis about the parameter θ of a Bayesian model. The e-value is obtained by a probability-possibility transformation of the model's posterior measure, p(θ), and can, in turn, be used to define the FBST or Full Bayesian Significance Test. This article investigates the relation of this novel approach to more standard probability-possibility transformations. In particular, we show how and why the e-value focuses on, and conforms with, s(θ) = p(θ)/r(θ), the model's surprise function relative to the reference density r(θ), while remaining consistent with the model's posterior probability measure. In addition, we investigate traditional objections raised in decision-theoretic Bayesian statistics against measures of significance engendered by probability-possibility transformations.
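
    Written out in the notation of the abstract, with the hypothesis set denoted Θ_H here as an assumption, the standard FBST construction is:

```latex
% Surprise function, its supremum over the hypothesis set, tangential set, and e-value.
s(\theta) = \frac{p(\theta \mid x)}{r(\theta)}, \qquad
s^{*} = \sup_{\theta \in \Theta_H} s(\theta), \qquad
T = \{\theta : s(\theta) > s^{*}\}, \qquad
\mathrm{ev}(H) = 1 - \Pr(\theta \in T \mid x)
```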

    Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness

    This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.

    Symmetry, Invariance and Ontology in Physics and Statistics

    This paper has three main objectives: (a) Discuss the formal analogy between some important symmetry-invariance arguments used in physics, probability and statistics. Specifically, we will focus on Noether’s theorem in physics, the maximum entropy principle in probability theory, and de Finetti-type theorems in Bayesian statistics; (b) Discuss the epistemological and ontological implications of these theorems, as they are interpreted in physics and statistics. Specifically, we will focus on the positivist (in physics) or subjective (in statistics) interpretations vs. objective interpretations that are suggested by symmetry and invariance arguments; (c) Introduce the cognitive constructivism epistemological framework as a solution that overcomes the realism-subjectivism dilemma and its pitfalls. The work of the physicist and philosopher Max Born will be particularly important in our discussion.
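
    As a concrete instance of the arguments referred to above, the maximum entropy principle in its textbook constrained form (a standard statement, not a result specific to this paper): among densities satisfying a set of moment constraints, the entropy maximizer is an exponential-family density.

```latex
\max_{p}\; -\!\int p(x)\,\log p(x)\,dx
\quad \text{subject to} \quad \int p(x)\,f_i(x)\,dx = c_i,\; i = 1,\dots,k,
\qquad\Longrightarrow\qquad
p^{*}(x) \;\propto\; \exp\!\Big(\textstyle\sum_{i=1}^{k} \lambda_i f_i(x)\Big)
```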

    Cognitive Constructivism, Eigen-Solutions, and Sharp Statistical Hypotheses

    In this paper, epistemological, ontological and sociological questions concerning the statistical significance of sharp hypotheses in scientific research are investigated within the framework provided by Cognitive Constructivism and the FBST (Full Bayesian Significance Test). The constructivist framework is contrasted with the traditional epistemological settings for orthodox Bayesian and frequentist statistics provided by Decision Theory and Falsificationism.