    Being Realist about Bayes, and the Predictive Processing Theory of Mind

    Some naturalistic philosophers of mind subscribing to the predictive processing theory of mind have adopted a realist attitude towards the results of Bayesian cognitive science. In this paper, we argue that this realist attitude is unwarranted. The Bayesian research program in cognitive science does not possess special epistemic virtues over alternative approaches for explaining mental phenomena involving uncertainty. In particular, the Bayesian approach is not simpler, more unifying, or more rational than alternatives. It is also contentious whether the Bayesian approach is, overall, better supported by the empirical evidence. Developing philosophical theories of mind on the basis of a realist interpretation of results from Bayesian cognitive science is therefore unwarranted. Naturalistic philosophers of mind should instead adopt an anti-realist attitude towards these results and remain agnostic as to whether Bayesian models are true. Continuing with an exclusive focus on, and praise of, Bayes within debates about the predictive processing theory will impede progress in the philosophical understanding both of scientific practice in computational cognitive science and of the architecture of the mind.
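    As a minimal illustrative restatement (not drawn from the paper itself), the Bayesian models at issue treat perception and cognition as computing a posterior distribution over hypotheses h given sensory data d:

        P(h \mid d) \;=\; \frac{P(d \mid h)\, P(h)}{\sum_{h'} P(d \mid h')\, P(h')}

    The realism question is whether the brain literally computes or approximates such posteriors, or whether the rule is merely a convenient modelling idiom.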

    Inadequacy of Modal Logic in Quantum Settings

    We test the principles of classical modal logic in fully quantum settings. Modal logic models our reasoning in multi-agent problems, and allows us to solve puzzles like the muddy children paradox. The Frauchiger-Renner thought experiment highlighted fundamental problems in applying classical reasoning when quantum agents are involved; we take it as a guiding example to test the axioms of classical modal logic. In doing so, we find a problem in the original formulation of the Frauchiger-Renner theorem: a missing assumption about unitarity of evolution is necessary to derive a contradiction and prove the theorem. Adding this assumption clarifies how different interpretations of quantum theory fit in, i.e., which properties they violate. Finally, we show how most of the axioms of classical modal logic break down in quantum settings, and attempt to generalize them. Namely, we introduce constructions of trust and context, which highlight the importance of the exact structure of trust relations between agents. We propose a challenge to the community: to find conditions for the validity of trust relations, strong enough to exorcise the paradox and weak enough to still recover classical logic. (Comment: In Proceedings QPL 2018, arXiv:1901.0947)
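    For orientation, the classical axioms under test are of the following textbook form, writing K_a φ for "agent a knows φ" (which of these the paper generalizes, and how, is not specified in the abstract):

        \begin{align*}
          \text{(K)}\quad & K_a(\varphi \rightarrow \psi) \rightarrow (K_a\varphi \rightarrow K_a\psi) && \text{distribution}\\
          \text{(T)}\quad & K_a\varphi \rightarrow \varphi && \text{factivity of knowledge}\\
          \text{(4)}\quad & K_a\varphi \rightarrow K_a K_a\varphi && \text{positive introspection}\\
          \text{(5)}\quad & \neg K_a\varphi \rightarrow K_a\neg K_a\varphi && \text{negative introspection}
        \end{align*}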

    Bayesian Cognitive Science, Monopoly, and Neglected Frameworks

    A widely shared view in the cognitive sciences is that discovering and assessing explanations of cognitive phenomena whose production involves uncertainty should be done in a Bayesian framework. One assumption supporting this modelling choice is that Bayes provides the best approach for representing uncertainty. However, it is unclear that Bayes possesses special epistemic virtues over alternative modelling frameworks, since a systematic comparison has yet to be attempted. It is therefore premature to assert that cognitive phenomena involving uncertainty are best explained within the Bayesian framework. As a forewarning, progress in cognitive science may be hindered if too many scientists continue to focus their efforts on Bayesian modelling, which risks monopolizing scientific resources that might be better allocated to alternative approaches.

    A Conceptual Model of Exploration Wayfinding: An Integrated Theoretical Framework and Computational Methodology

    This thesis is an attempt to integrate contending cognitive approaches to modeling wayfinding behavior. The primary goal is to create a plausible model for exploration tasks within indoor environments. This conceptual model can be extended for practical applications in the design, planning, and social sciences. Using empirical evidence, a cognitive schema is designed that accounts for perceptual and behavioral preferences in pedestrian navigation. With this schema as a guiding framework, network analysis and space syntax are used as computational methods to simulate human exploration wayfinding in unfamiliar indoor environments. The conceptual model is then implemented in two ways: first, by directly updating an existing agent-based modeling software package; second, by means of a spatial interaction model that distributes visual attraction and movement permeability across a graph representation of building floor plans.
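    A minimal sketch of the kind of graph representation described above, assuming a hypothetical floor plan and using closeness centrality as a rough stand-in for a space-syntax integration measure (the thesis's own measures and software are not named in the abstract):

        import networkx as nx

        # Hypothetical floor plan: nodes are spaces, edges are direct connections
        # (doorways/openings). Names and layout are illustrative only.
        G = nx.Graph()
        G.add_edges_from([
            ("entrance", "lobby"),
            ("lobby", "corridor"),
            ("corridor", "room_a"),
            ("corridor", "room_b"),
            ("lobby", "stairs"),
        ])

        # Closeness centrality: spaces that are, on average, few steps from all
        # other spaces score higher -- a crude proxy for how "integrated" a space
        # is, and hence how likely an exploring agent is to pass through it.
        integration_proxy = nx.closeness_centrality(G)

        for space, score in sorted(integration_proxy.items(), key=lambda kv: -kv[1]):
            print(f"{space:10s} {score:.3f}")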

    Can American Constitutional Law Be Postmodern?

    Death of Paradox: The Killer Logic beneath the Standards of Proof

    The prevailing but contested view of proof standards is that factfinders should determine facts by probabilistic reasoning. Given imperfect evidence, they should ask themselves what they think the chances are that the burdened party would be right if the truth were to become known; they then compare those chances to the applicable standard of proof. I contend that for understanding the standards of proof, the modern versions of logic — in particular, fuzzy logic and belief functions — work better than classical probability. This modern logic suggests that factfinders view evidence of an imprecisely perceived and described reality to form a fuzzy degree of belief in a fact’s existence; they then apply the standard of proof in accordance with the theory of belief functions, by comparing their belief in a fact’s existence to their belief in its negation. This understanding explains how the standard of proof actually works in the law world. It gives a superior mental image of the factfinders’ task, conforms more closely to what we know of people’s cognition, and captures better what the law says its standards are and how it manipulates them. One virtue of this conceptualization is that it is not a radically new view. Another virtue is that it nevertheless manages to resolve some stubborn problems of proof, including the infamous conjunction paradox
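    A toy illustration of the conjunction paradox and of how a fuzzy conjunction behaves differently (a sketch of the general idea with made-up numbers, not the article's own formalism):

        # Civil standard of proof: "more likely than not", i.e. a threshold of 0.5.
        THRESHOLD = 0.5

        # Hypothetical claim with two independent elements, each found to degree 0.6.
        element_1 = 0.6
        element_2 = 0.6

        # Classical probability: independent elements multiply, so the claim as a
        # whole drops below the threshold even though each element clears it.
        prob_conjunction = element_1 * element_2          # 0.36 -> claim fails

        # Fuzzy conjunction (Zadeh's min operator): the conjunction is only as
        # strong as its weakest element, so the claim still clears the threshold.
        fuzzy_conjunction = min(element_1, element_2)     # 0.60 -> claim succeeds

        print(prob_conjunction > THRESHOLD)   # False
        print(fuzzy_conjunction > THRESHOLD)  # True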

    Review of risk and uncertainty concepts for climate change assessments including human dimensions

    Submitted to Philosophy Studies.
    This paper discusses aspects of risk and uncertainty relevant to an interdisciplinary assessment of climate change policy. It contrasts not only the objective approach with the subjective approach, but also situations in which precise probabilities are well founded with situations involving broader forms of error, such as Knightian or deep uncertainty, incompleteness, and vagueness. Additional human and social dimensions of ignorance are discussed: strategic uncertainties, surprises, diversity of values, and taboos. We argue that the broader forms of error affect all sciences, including those studying Nature. For these aspects, the IPCC guidance notes provide an interdisciplinary, unified approach to risk and uncertainty. This is a significant advance over a simple multidisciplinary juxtaposition of approaches. However, these guidance notes are not universal: they mostly omit the human and social dimensions of ignorance.
    This paper discusses the various aspects of risk and uncertainty relevant to the interdisciplinary assessment of climate policies. It marks not only the opposition between the objective approach (which views probabilities as degrees of truth) and the Bayesian approach (which views them as degrees of certainty), but also the opposition between situations of risk (when precise, well-founded probabilities are available) and situations of uncertainty (more general forms of ignorance, such as Knightian uncertainty, incompleteness, or vagueness). The evolution of the IPCC guidance on risk and uncertainty between the third and fourth reports can be read as a move away from the objectivist, probabilistic position towards including more complex aspects of uncertainty. However, human dimensions such as strategic ignorance, surprises, metaphysical aspects, taboos, and epistemic uncertainty are still missing from the IPCC guidance.
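    A toy numerical illustration of the contrast between risk and deeper forms of uncertainty drawn above (the numbers are hypothetical and not from the paper):

        # Hypothetical flood-damage example: damage of 100 if the event occurs, 0 otherwise.
        damage = 100.0

        # Risk: a precise, well-founded probability is available, so a single
        # expected damage can be computed.
        p = 0.02
        expected_damage = p * damage                              # 2.0

        # Deep (Knightian) uncertainty: only bounds on the probability are
        # defensible, so the analysis can report only a range of expected damages.
        p_low, p_high = 0.001, 0.1
        expected_damage_range = (p_low * damage, p_high * damage)  # (0.1, 10.0)

        print(expected_damage, expected_damage_range)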

    What we talk about when we talk about uncertainty. Toward a unified, data-driven framework for uncertainty characterization in hydrogeology

    In this manuscript, we compare and discuss different frameworks for hydrogeological uncertainty analysis. Since uncertainty is a property of knowledge, we base this comparison on purely epistemological concepts. In a detailed comparison between different candidates, we make the case for Bayesianism, i.e., the framework of reasoning about uncertainty using probability theory. We motivate the use of Bayesian tools, briefly explain the properties of Bayesian inference, prediction, and decision, and identify the most pressing current challenges of this framework. In hydrogeology, these challenges are the derivation of prior distributions for parametric uncertainty (typically hydraulic conductivity values) and the choice of the most relevant paradigm for generating subsurface structures to assess structural uncertainty. We present the most commonly used paradigms and give detailed advice on two specific ones: Gaussian multivariate random fields and multiple-point statistics, both of which have benefits and drawbacks. Without settling on either of these paradigms, we identify the lack of open-access data repositories as the most pressing current impediment to the advancement of data-driven uncertainty analysis. We detail the shortcomings of the current situation and describe a number of steps that could foster the application of both the Gaussian and the multiple-point paradigms. We close the manuscript with a call for a community-wide initiative to create this necessary support.
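    A minimal sketch of the first paradigm mentioned above, drawing one realization of a log-hydraulic-conductivity field as a Gaussian random field on a small 1-D grid (all parameter values are assumptions for illustration, not recommendations from the manuscript):

        import numpy as np

        n = 200                          # number of grid cells
        dx = 1.0                         # cell size [m]
        x = np.arange(n) * dx            # cell centres

        mean_logK = -5.0                 # assumed prior mean of ln(K)
        sigma = 1.0                      # assumed standard deviation of ln(K)
        corr_len = 20.0                  # assumed correlation length [m]

        # Exponential covariance: C_ij = sigma^2 * exp(-|x_i - x_j| / corr_len)
        lags = np.abs(x[:, None] - x[None, :])
        cov = sigma**2 * np.exp(-lags / corr_len)

        # Sample mean + L z with L L^T = C and z ~ N(0, I); small jitter keeps
        # the Cholesky factorization numerically stable.
        rng = np.random.default_rng(0)
        L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))
        log_K = mean_logK + L @ rng.standard_normal(n)

        K = np.exp(log_K)                # one realization of the conductivity field
        print(K[:5])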