33 research outputs found

    Probabilities with Gaps and Gluts

    Belnap-Dunn logic (BD), sometimes also known as First Degree Entailment, is a four-valued propositional logic that complements the classical truth values True and False with two non-classical truth values, Neither and Both. The latter two account for the possibility that the available information is incomplete or provides contradictory evidence. In this paper, we present a probabilistic extension of BD that permits agents to have probabilistic beliefs about the truth and falsity of a proposition. We provide a sound and complete axiomatization for the framework and identify policies for conditionalization and aggregation. Concretely, we introduce four-valued equivalents of Bayes' and Jeffrey updating, and we suggest mechanisms for aggregating information from different sources.
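The four-valued setting can be sketched in a few lines; the function names and the example numbers below are illustrative, not taken from the paper. An agent assigns a probability mass function over the Belnap-Dunn truth values True (T), False (F), Neither (N) and Both (B).

```python
# Illustrative sketch, assuming belief in truth is the mass of the values on
# which the proposition is at least true (T or B), and dually for falsity.

def belief_in_truth(mass):
    """Probability that the proposition is at least true: p(T) + p(B)."""
    return mass["T"] + mass["B"]

def belief_in_falsity(mass):
    """Probability that the proposition is at least false: p(F) + p(B)."""
    return mass["F"] + mass["B"]

# With gaps (N) and gluts (B), the degrees of belief in truth and falsity
# need not sum to 1, unlike in the classical two-valued case.
m = {"T": 0.5, "F": 0.2, "N": 0.2, "B": 0.1}
assert abs(sum(m.values()) - 1.0) < 1e-9
print(round(belief_in_truth(m), 2))    # 0.6
print(round(belief_in_falsity(m), 2))  # 0.3  (0.6 + 0.3 != 1)
```

The point of the sketch is only that truth-belief and falsity-belief decouple once gaps and gluts carry probability mass.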

    A computational study on the effects of the organizational structures on the risk of different types of banking groups

    In this dissertation, two theoretical models are used to compare centralized and decentralized banking structures. In the first approach, the problem for both banks is to choose an expansive or restrictive credit policy without complete knowledge of the state of the overall and local economies. Observing and appraising verifiable ("hard") information is the advantage of centralized banks, whereas considering unverifiable ("soft") information about the local economies, so-called soft signals, is the key asset of decentralized banks. To compare the two banking systems, a risk-return trade-off method is used to determine which type of banking system might perform better. Although overestimation of the local economy may have a negative impact, this soft signal has a broadly positive impact on risk measures. As a result, decentralized bank managers are better at detecting bad loans in their banks. In addition, because small banks have less bureaucracy, borrowers can obtain credit more easily and swiftly. In the second approach, a theoretical bank-run model based on Chari and Jagannathan (1988) is developed by adding a cheap-talk game, in order to compare banking structures during bank shocks, when managers communicate strategically with depositors to prevent inefficient bank runs. The two banks behave considerably differently in the local economy, an issue directly tied to regional culture. Limited punishment in the legal system gives the management of centralized banks some incentive to be deceptive. Consequently, in the modified model, the higher the punishment or the lower the salary, the less likely the manager is to be persuaded to lie. In small banks, by contrast, trust and soft information between bank management and depositors protect against inefficient bank runs.
A decentralized banking system can improve the financial system's credibility by mitigating undesirable shocks during times of crisis. Hence, having decentralized banks in the banking structure increases depositors' welfare.

    Adams Conditioning and Likelihood Ratio Transfer Mediated Inference


    Popper's Severity of Test


    Bayesianism And Self-Locating Beliefs


    Generalized belief change with imprecise probabilities and graphical models

    We provide a theoretical investigation of probabilistic belief revision in complex frameworks, under extended conditions of uncertainty, inconsistency and imprecision. We motivate our kinematical approach by specializing our discussion to probabilistic reasoning with graphical models, whose modular representation allows for efficient inference. Most results in this direction derive from the work of Chan and Darwiche (2005), which first proved the inter-reducibility of virtual and probabilistic evidence. These forms of information, deeply distinct in their meaning, are extended to the conditional and imprecise frameworks, allowing further generalizations, e.g. to experts' qualitative assessments. Belief aggregation and iterated revision of a rational agent's beliefs are also explored.
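The inter-reducibility result credited above to Chan and Darwiche (2005) can be illustrated on a single binary variable; the names and numbers below are illustrative. Probabilistic (Jeffrey-style) evidence fixes new marginals q on a partition, while virtual evidence supplies likelihoods; choosing the likelihood ratio q/p turns the virtual-evidence update into the Jeffrey update.

```python
# Sketch under simplifying assumptions: one binary variable X, prior p,
# target Jeffrey marginals q. Virtual evidence with lambda(x) = q(x)/p(x)
# reproduces q exactly.

prior = {"x": 0.3, "not_x": 0.7}   # p: prior over X
q = {"x": 0.6, "not_x": 0.4}       # q: target (Jeffrey) marginals

def virtual_evidence_update(p, lam):
    """Multiply the prior by the likelihoods and renormalize."""
    unnorm = {k: p[k] * lam[k] for k in p}
    z = sum(unnorm.values())
    return {k: v / z for k, v in unnorm.items()}

lam = {k: q[k] / prior[k] for k in prior}   # likelihood ratios q/p
posterior = virtual_evidence_update(prior, lam)
print({k: round(v, 3) for k, v in posterior.items()})  # matches q
```

The same construction runs in reverse: any virtual-evidence update determines the Jeffrey marginals it induces, which is the sense of the inter-reducibility claim.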

    What we talk about when we talk about uncertainty. Toward a unified, data-driven framework for uncertainty characterization in hydrogeology

    In this manuscript, we compare and discuss different frameworks for hydrogeological uncertainty analysis. Since uncertainty is a property of knowledge, we base this comparison on purely epistemological concepts. In a detailed comparison between different candidates, we make the case for Bayesianism, i.e., the framework of reasoning about uncertainty using probability theory. We motivate the use of Bayesian tools, briefly explain the properties of Bayesian inference, prediction and decision, and identify the most pressing current challenges of this framework. In hydrogeology, these challenges are the derivation of prior distributions for the parametric uncertainty, typically hydraulic conductivity values, as well as the most relevant paradigm for generating subsurface structures for assessing the structural uncertainty. We present the most commonly used paradigms and give detailed advice on two specific paradigms: Gaussian multivariate random fields and multiple-point statistics, both of which have benefits and drawbacks. Without settling for either of these paradigms, we identify the lack of open-access data repositories as the most pressing current impediment to the advancement of data-driven uncertainty analysis. We detail the shortcomings of the current situation and describe a number of steps which could foster the application of both the Gaussian and the multiple-point paradigm. We close the manuscript with a call for a community-wide initiative to create this necessary support.
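The Gaussian random field paradigm mentioned above can be sketched in a few lines; the covariance model, grid size and parameter values are illustrative, not taken from the manuscript.

```python
import numpy as np

# Illustrative sketch: draw one realization of a stationary Gaussian random
# field (e.g. of log-hydraulic-conductivity) on a small 1-D grid, using an
# exponential covariance C(h) = sigma^2 * exp(-|h| / ell) and a Cholesky
# factor of the covariance matrix.
rng = np.random.default_rng(0)

n, sigma2, ell = 50, 1.0, 10.0           # grid size, variance, correlation length
x = np.arange(n, dtype=float)
C = sigma2 * np.exp(-np.abs(x[:, None] - x[None, :]) / ell)  # covariance matrix
L = np.linalg.cholesky(C + 1e-10 * np.eye(n))                # small jitter for stability
field = L @ rng.standard_normal(n)       # zero-mean realization with covariance C

print(field.shape)  # (50,)
```

Multiple-point statistics, by contrast, resamples patterns from a training image rather than imposing a two-point covariance, which is why the two paradigms trade off differently between analytic tractability and geological realism.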

    Frontiers in Psychology / Imprecise Uncertain Reasoning: A Distributional Approach

    The contribution proposes to model imprecise and uncertain reasoning by a mental probability logic that is based on probability distributions. It shows how distributions are combined with logical operators and how distributions propagate in inference rules. It discusses a series of examples such as the Linda task, the suppression task, Doherty's pseudodiagnosticity task, and some of the deductive reasoning tasks of Rips. It demonstrates how to update distributions by soft evidence and how to represent correlated risks. The probabilities inferred from different logical inference forms may be so similar that it is impossible to distinguish them empirically in a psychological study. Second-order distributions make it possible to obtain the probability distribution of being coherent. The maximum probability of being coherent is a second-order criterion of rationality. Technically, the contribution relies on beta distributions, copulas, vines, and stochastic simulation.
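The second-order idea can be sketched by simulation; the beta parameters below are illustrative, not taken from the article. Draw the premise probabilities P(A), P(B) and an assessed P(A and B) from beta distributions, and estimate the second-order probability that the three assessments are coherent, i.e. that the conjunction lies within its Fréchet bounds.

```python
import random

# Illustrative sketch: coherence of a conjunction assessment requires
# max(0, a + b - 1) <= c <= min(a, b), where a = P(A), b = P(B), c = P(A and B).
random.seed(1)

def prob_coherent(n=100_000):
    """Monte Carlo estimate of the second-order probability of coherence."""
    hits = 0
    for _ in range(n):
        a = random.betavariate(8, 2)   # P(A), concentrated near 0.8
        b = random.betavariate(6, 4)   # P(B), concentrated near 0.6
        c = random.betavariate(5, 5)   # assessed P(A and B), near 0.5
        if max(0.0, a + b - 1.0) <= c <= min(a, b):
            hits += 1
    return hits / n

print(prob_coherent())
```

Independent draws stand in for the copula/vine machinery of the article, which models dependence between the premise probabilities; the coherence check itself is the same.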

    Imprecise probability in epistemology

    There is a growing interest in both the foundations and the application of imprecise probability in contemporary epistemology. This dissertation is concerned with the application. In particular, the research presented concerns ways in which imprecise probability, i.e. sets of probability measures, may helpfully address certain philosophical problems pertaining to rational belief. The issues I consider are disagreement among epistemic peers, complete ignorance, and inductive reasoning with imprecise priors. For each of these topics, it is assumed that belief can be modeled with imprecise probability, and thus a non-classical solution can be given to each problem. I argue that this works for peer disagreement and complete ignorance. However, the approach has its shortcomings, too, specifically with regard to inductive reasoning with imprecise priors. Ultimately, the dissertation illustrates that imprecise probability as a model of rational belief holds considerable promise, though one should also be aware of its limitations.
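The core mechanics of belief as a set of probability measures can be sketched on an interval-valued prior; the numbers and likelihoods below are illustrative, not from the dissertation.

```python
# Illustrative sketch: an imprecise prior on hypothesis H is the set of all
# priors in [0.2, 0.6]. Updating the set means updating every member by
# Bayes' rule with fixed likelihoods P(E|H) = 0.9 and P(E|~H) = 0.3; since
# the posterior is monotone in the prior here, the interval endpoints map to
# the posterior interval's endpoints.
def bayes(p_h, like_h=0.9, like_not_h=0.3):
    return p_h * like_h / (p_h * like_h + (1 - p_h) * like_not_h)

lower, upper = bayes(0.2), bayes(0.6)
print((round(lower, 3), round(upper, 3)))  # (0.429, 0.818)
```

The width of the posterior interval is one natural reading of how much imprecision survives the evidence, which is where phenomena such as belief inertia in inductive reasoning become pressing.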

    Confirmation and Evidence

    The question of how experience acts on our beliefs, and how beliefs are changed in the light of experience, is one of the oldest and most controversial questions in philosophy in general and epistemology in particular. Philosophy of science has replaced this question with the more specific enquiry into how the results of experiments act on scientific hypotheses and theories. Why do we maintain some theories while discarding others? Two general questions emerge: First, what is our reason to accept the justifying power of experience and, more specifically, of scientific experiments? Second, how can the relationship between theory and evidence be described, and under which circumstances is a scientific theory confirmed by a piece of evidence? The book focuses on the second question: explicating the relationship between theory and evidence and capturing the structure of a valid inductive argument. Special attention is paid to statistical applications that are prevalent in modern empirical science. After an introductory chapter about the link between confirmation and induction, the project starts by discussing qualitative accounts of confirmation in first-order predicate logic. Two major approaches, the Hempelian satisfaction criterion and the hypothetico-deductivist tradition, are contrasted with each other. This is subsequently extended to an account of the confirmation of entire theories, as opposed to the confirmation of single hypotheses. Then the quantitative Bayesian account of confirmation is explained and discussed on the basis of a theory of rational degrees of belief. After that, I present the various schools of statistical inference and explain the foundations of these competing schemes. Finally, I argue for a specific concept of statistical evidence, summarize the results, and sketch some open questions.
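The quantitative Bayesian account mentioned above can be illustrated with one standard explication; the numbers are illustrative, not from the book. Evidence E confirms hypothesis H just in case P(H|E) > P(H), and the difference d(H, E) = P(H|E) - P(H) is one common confirmation measure.

```python
# Illustrative sketch of the difference measure of confirmation, assuming
# the prior P(H) and the likelihoods P(E|H), P(E|~H) are given.
def posterior(p_h, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(H)P(E|H) / P(E)."""
    p_e = p_h * p_e_given_h + (1 - p_h) * p_e_given_not_h
    return p_h * p_e_given_h / p_e

p_h = 0.1
p_post = posterior(p_h, p_e_given_h=0.8, p_e_given_not_h=0.2)
d = p_post - p_h
print(round(d, 3))  # 0.208: d > 0, so E confirms H on this measure
```

Other measures (ratio, likelihood-ratio, etc.) order the same cases differently, which is part of what a quantitative account of confirmation has to adjudicate.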