Models and Simulations in Material Science: Two Cases Without Error Bars
We discuss two research projects in material science in which the results cannot be stated with an estimation of the error: a spectroscopic ellipsometry study aimed at determining the orientation of DNA molecules on diamond, and a scanning tunneling microscopy study of platinum-induced nanowires on germanium. To investigate the reliability of the results, we apply ideas from the philosophy of models in science. Even if the studies had reported an error value, the trustworthiness of the result would not depend on that value alone.

Comment: 20 pages, 2 figures
Infinitesimal Probabilities
Non-Archimedean probability functions allow us to combine regularity with perfect additivity. We discuss the philosophical motivation for a particular choice of axioms for a non-Archimedean probability theory and answer some philosophical objections that have been raised against infinitesimal probabilities in general.
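The standard motivating example for infinitesimal probabilities is the fair lottery on the natural numbers; the following textbook derivation (my own illustration, not drawn from the abstract) shows why regularity fails for real-valued, countably additive probability:

```latex
% Fair lottery on N: uniformity forces P({n}) = p for every ticket n.
\[
P(\{n\}) = p \quad \text{for all } n \in \mathbb{N}
\quad\Longrightarrow\quad
P(\mathbb{N}) \;=\; \sum_{n=1}^{\infty} p \;=\;
\begin{cases}
0 & \text{if } p = 0,\\[2pt]
\infty & \text{if } p > 0.
\end{cases}
\]
```

No real value of $p$ yields $P(\mathbb{N}) = 1$: setting $p = 0$ violates regularity (possible events receive probability zero), while any $p > 0$ breaks normalization. Assigning each ticket a positive infinitesimal in a non-Archimedean field is the kind of move the abstract's axiomatization is meant to license.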
Rationality: a social-epistemology perspective
Both in philosophy and in psychology, human rationality has traditionally been studied from an "individualistic" perspective. Recently, social epistemologists have drawn attention to the fact that epistemic interactions among agents also give rise to important questions concerning rationality. In previous work, we have used a formal model to assess the risk that a particular type of social-epistemic interaction leads agents with initially consistent belief states into inconsistent belief states. Here, we continue this work by investigating the dynamics to which these interactions may give rise in the population as a whole.
The Snow White problem
The Snow White problem is introduced to demonstrate how learning something of which one could not have learnt the opposite (due to observer selection bias) can change an agent's probability assignment. This helps us to analyse the Sleeping Beauty problem, which is deconstructed as a combinatorial engine and a subjective wrapper. The combinatorial engine of the problem is analogous to Bertrand's boxes paradox and can be solved with standard probability theory. The subjective wrapper is clarified using the Snow White problem. Sample spaces for all three problems are presented. The conclusion is that subjectivity plays no irreducible role in solving the Sleeping Beauty problem and that no reference to centered worlds is required to provide the answer.
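The combinatorial engine is said to be analogous to Bertrand's boxes paradox, which standard probability theory resolves with the answer 2/3 rather than the intuitive 1/2. A quick Monte Carlo sketch (my own illustration, not code from the paper) reproduces that answer:

```python
import random

# Bertrand's boxes: three boxes hold (gold, gold), (gold, silver),
# and (silver, silver). Draw a box at random, then a coin at random.
# Given that the drawn coin is gold, estimate the probability that
# the other coin in the same box is also gold.
def bertrand_boxes(trials: int = 100_000, seed: int = 0) -> float:
    rng = random.Random(seed)
    boxes = [("gold", "gold"), ("gold", "silver"), ("silver", "silver")]
    gold_seen = 0
    other_gold = 0
    for _ in range(trials):
        box = rng.choice(boxes)
        i = rng.randrange(2)          # which coin we draw first
        if box[i] == "gold":
            gold_seen += 1
            if box[1 - i] == "gold":  # the remaining coin
                other_gold += 1
    return other_gold / gold_seen

print(bertrand_boxes())  # ≈ 2/3, not 1/2
```

Conditioning on the observation "the drawn coin is gold" leaves two of the three gold coins sitting in the all-gold box, which is where the 2/3 comes from.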
Uniform probability in cosmology
Problems with uniform probabilities on an infinite support show up in contemporary cosmology. This paper focuses on the context of inflation theory, where such problems complicate the assignment of a probability measure over pocket universes. The measure problem in cosmology, whereby it seems impossible to pick out a uniquely well-motivated measure, is associated with a paradox that occurs in standard probability theory and crucially involves uniformity on an infinite sample space. This problem has been discussed by physicists, albeit without reference to earlier work on this topic. The aim of this article is both to introduce philosophers of probability to these recent discussions in cosmology and to familiarize physicists and philosophers working on cosmology with relevant foundational work by Kolmogorov, de Finetti, Jaynes, and other probabilists. As such, the main goal is not to solve the measure problem, but to clarify the exact origin of some of the current obstacles. The analysis of the assumptions going into the paradox indicates that there exist multiple ways of dealing consistently with uniform probabilities on infinite sample spaces. Taking a pluralist stance towards the mathematical methods used in cosmology shows there is some room for progress with assigning probabilities in cosmological theories.

Comment: 16 pages; accepted for publication in Studies in History and Philosophy of Science
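One standard way of "dealing consistently with uniform probabilities on infinite sample spaces" (a textbook construction in the de Finetti tradition, offered here only as an illustration of the pluralism the abstract mentions) is to drop countable additivity and use natural density on $\mathbb{N}$:

```latex
% Natural density: the limiting relative frequency of A among {1,...,N}.
\[
P(A) \;=\; \lim_{N \to \infty} \frac{\lvert A \cap \{1, \dots, N\} \rvert}{N}.
\]
```

This assigns the even numbers probability $1/2$ and each singleton $\{n\}$ probability $0$, so it is uniform; but countable additivity fails, since $\sum_{n} P(\{n\}) = 0 \neq 1 = P(\mathbb{N})$. It is thus a finitely additive, merely partial measure, which is the kind of trade-off at stake in the measure problem.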
Degrees of riskiness, falsifiability, and truthlikeness. A neo-Popperian account applicable to probabilistic theories
In this paper, we take a fresh look at three Popperian concepts: riskiness, falsifiability, and truthlikeness (or verisimilitude) of scientific hypotheses or theories. First, we make explicit the dimensions that underlie the notion of riskiness. Second, we examine whether and how degrees of falsifiability can be defined, and how they are related to various dimensions of the concept of riskiness as well as the experimental context. Third, we consider the relation of riskiness to (expected degrees of) truthlikeness. Throughout, we pay special attention to probabilistic theories and we offer a tentative, quantitative account of verisimilitude for probabilistic theories.

Comment: 41 pages; 3 figures; accepted for publication in Synthese
Degrees of freedom
Human freedom is in tension with nomological determinism and with statistical determinism. The goal of this paper is to answer both challenges. Four contributions are made to the free-will debate. First, we propose a classification of scientific theories based on how much freedom they allow. We take into account that indeterminism comes in different degrees and that both the laws and the auxiliary conditions can place constraints. A scientific worldview pulls towards one end of this classification, while libertarianism pulls towards the other end of the spectrum. Second, inspired by Hoefer, we argue that an interval of auxiliary conditions corresponds to a region in phase space, and to a bundle of possible block universes. We thus make room for a form of non-nomological indeterminism. Third, we combine crucial elements from the works of Hoefer and List, and we attempt to give a libertarian reading of this combination. On our proposal, throughout spacetime, there is a certain amount of freedom (equivalent to setting the initial, intermediate, or final conditions) that can be interpreted as the result of agential choices. Fourth, we focus throughout on the principle of alternative possibilities and propose three ways of strengthening it.
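The idea that an interval of auxiliary conditions corresponds to a region in phase space, and hence to a bundle of possible block universes, can be made concrete with a toy deterministic law (my own minimal sketch, not an example from the paper):

```python
import math

# Toy deterministic law dx/dt = x, whose solution is x(t) = x0 * e^t.
# A single initial condition fixes one trajectory (one "block universe");
# an *interval* of initial conditions fixes a whole bundle of them.
def evolve(x0: float, t: float) -> float:
    return x0 * math.exp(t)

interval = (0.9, 1.1)  # interval of auxiliary conditions at t = 0
t = 1.0
bundle = (evolve(interval[0], t), evolve(interval[1], t))
print(bundle)  # endpoints of the evolved region in phase space
```

The law itself is fully deterministic; the residual openness lives entirely in the width of the interval of auxiliary conditions, which is the non-nomological sense of indeterminism the abstract gestures at.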