Second Order Inductive Logic and Wilmers' Principle
We extend the framework of Inductive Logic to Second Order languages and introduce Wilmers' Principle, a rational principle for probability functions on Second Order languages. We derive a representation theorem for functions satisfying this principle and investigate its relationship to the first order principles of Regularity and Super Regularity.
Deliberation, Judgement and the Nature of Evidence
A normative Bayesian theory of deliberation and judgement requires a procedure for merging the evidence of a collection of agents. In order to provide such a procedure, one needs to ask what the evidence is that grounds Bayesian probabilities. After finding fault with several views on the nature of evidence (the views that evidence is knowledge; that evidence is whatever is fully believed; that evidence is observationally set credence; that evidence is information), it is argued that evidence is whatever is rationally taken for granted. This view is shown to have consequences for an account of merging evidence, and it is argued that standard axioms for merging need to be altered somewhat.
The Counterpart Principle of Analogical Support by Structural Similarity
We propose and investigate an Analogy Principle in the context of Unary Inductive Logic based on a notion of support by structural similarity, which is often employed to motivate scientific conjectures.
An Analysis of Tennenbaum's Theorem in Constructive Type Theory
Tennenbaum's theorem states that the only countable model of Peano arithmetic (PA) with computable arithmetical operations is the standard model of natural numbers. In this paper, we use constructive type theory as a framework to revisit, analyze and generalize this result. The chosen framework allows for a synthetic approach to computability theory, exploiting that, externally, all functions definable in constructive type theory can be shown computable. We then build on this viewpoint and furthermore internalize it by assuming a version of Church's thesis, which expresses that any function on natural numbers is representable by a formula in PA. This assumption provides for a conveniently abstract setup to carry out rigorous computability arguments, even in the theorem's mechanization. Concretely, we constructivize several classical proofs and present one inherently constructive rendering of Tennenbaum's theorem, all following arguments from the literature. Concerning the classical proofs in particular, the constructive setting allows us to highlight differences in their assumptions and conclusions which are not visible classically. All versions are accompanied by a unified mechanization in the Coq proof assistant. Comment: 23 pages, extension of conference paper published at FSCD 202
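For orientation, the theorem the abstract refers to can be stated in its standard classical form as follows (our paraphrase, not taken verbatim from the paper):

```latex
\textbf{Theorem (Tennenbaum).}
Let $\mathcal{M} = (M, +^{\mathcal{M}}, \cdot^{\mathcal{M}}, 0, 1)$ be a
countable model of $\mathrm{PA}$ such that, under some coding of $M$ by
natural numbers, the functions $+^{\mathcal{M}}$ and $\cdot^{\mathcal{M}}$
are computable. Then $\mathcal{M}$ is isomorphic to the standard model
$\mathbb{N}$.
```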
Sets and Probability
In this article the idea of random variables over the set theoretic universe is investigated. We explore what it can mean for a random set to have a specific probability of belonging to an antecedently given class of sets.
A natural prior probability distribution derived from the propositional calculus
A σ-additive probability measure on the real interval [0, 1] is defined by considering the expected values of “randomly chosen” large formulae of the propositional calculus, where the propositional variables are treated as independent random variables on {0, 1} with expected value 1/2. Although arising naturally from logical and/or cognitive considerations, this measure is extremely complex and displays certain formally pathological features, including infinite density at all points of a certain dense subset of [0, 1]. Certain variants of the construction are also considered. The introduction includes an account of motivation for the study of such measures arising from a fundamental problem in inexact reasoning.
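As a rough illustration of the kind of quantity involved (a sketch of ours, not the paper's construction: the formula generator, its branching probability, and the Monte Carlo estimator are all hypothetical choices), one can sample a random propositional formula, treat each variable as an independent fair coin on {0, 1}, and estimate the probability that the formula comes out true:

```python
import random

def random_formula(depth, n_vars, rng):
    """Build a random formula tree over variables 0..n_vars-1.

    Leaves are variables; internal nodes are not/and/or. The 0.3
    early-stopping probability is an arbitrary illustrative choice.
    """
    if depth == 0 or rng.random() < 0.3:
        return ("var", rng.randrange(n_vars))
    op = rng.choice(["not", "and", "or"])
    if op == "not":
        return ("not", random_formula(depth - 1, n_vars, rng))
    return (op,
            random_formula(depth - 1, n_vars, rng),
            random_formula(depth - 1, n_vars, rng))

def evaluate(formula, assignment):
    """Evaluate a formula tree under a 0/1 assignment to the variables."""
    tag = formula[0]
    if tag == "var":
        return assignment[formula[1]]
    if tag == "not":
        return 1 - evaluate(formula[1], assignment)
    a = evaluate(formula[1], assignment)
    b = evaluate(formula[2], assignment)
    return a & b if tag == "and" else a | b

def truth_probability(formula, n_vars, samples, rng):
    """Monte Carlo estimate of P(formula is true) when each variable is
    an independent random variable on {0, 1} with expected value 1/2."""
    hits = 0
    for _ in range(samples):
        assignment = [rng.randrange(2) for _ in range(n_vars)]
        hits += evaluate(formula, assignment)
    return hits / samples

rng = random.Random(0)
phi = random_formula(depth=8, n_vars=4, rng=rng)
p = truth_probability(phi, n_vars=4, samples=2000, rng=rng)
print(0.0 <= p <= 1.0)
```

Repeating this for many independently drawn large formulae yields an empirical distribution of truth probabilities on [0, 1]; the measure studied in the paper arises as a limit of such considerations, and the abstract notes that, despite this simple-looking setup, the limiting measure is formally pathological.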