Well-posedness and physical possibility
There is a sentiment shared among physicists that well-posedness is a necessary condition for physical possibility. The arguments usually offered for well-posedness have an epistemic flavor, and thus they fall short of establishing the metaphysical claim that lack of well-posedness implies physical impossibility. In this work we analyze the relationship of well-posedness to prediction and confirmation, as well as the notion of physical possibility, and we devise three novel and independent argumentative strategies that may succeed where the usual epistemic arguments fail.
How do macrostates come about?
This paper is a further consideration of Hemmo and Shenker's (2012) ideas about the proper conceptual characterization of macrostates in statistical mechanics. We provide two formulations of how macrostates come about as elements of certain partitions of the system's phase space imposed by the interaction between the system and an observer, and we show that these two formulations are mathematically equivalent. We also reflect on conceptual issues regarding the relationship of macrostates to distinguishability, thermodynamic regularity, observer dependence, and the general phenomenon of measurement.
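The idea of macrostates as cells of an observer-induced partition can be sketched computationally. The following is a minimal illustration, not the paper's own construction: a finite "phase space" is partitioned into macrostates by taking preimages of a coarse observable's values, where the observable stands in for the system-observer interaction. All names and numbers here are illustrative assumptions.

```python
from collections import defaultdict

# Toy finite phase space and a hypothetical coarse-grained observable
# standing in for the observer's measurement resolution.
phase_space = range(10)

def observable(x):
    return x // 3  # observer cannot distinguish states within a cell of width 3

# Each macrostate is the preimage of one measured value: the set of
# microstates the observer cannot tell apart.
macrostates = defaultdict(set)
for x in phase_space:
    macrostates[observable(x)].add(x)

print(sorted(map(sorted, macrostates.values())))
# cells: [0,1,2], [3,4,5], [6,7,8], [9]
```

The cells jointly exhaust the phase space and are pairwise disjoint, which is what makes them a partition in the sense the abstract describes.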
Is it the Principal Principle that implies the Principle of Indifference?
Hawthorne, Landes, Wallmann and Williamson (2015) argue that the Principal Principle implies a version of the Principle of Indifference. We show that what the authors take to be the Principle of Indifference can be obtained without invoking anything that would seem to be related to the Principal Principle. In the Appendix we also discuss several conditions proposed in the same paper.
A dynamical systems approach to causation
Our approach aims at accounting for causal claims in terms of how the physical states of the underlying dynamical system evolve with time. Causal claims assert connections between two sets of physical states: their truth depends on whether the two sets in question are genuinely connected by time evolution, such that physical states from one set evolve with time into the states of the other set. We demonstrate the virtues of our approach by showing how it is able to account for typical causes, causally relevant factors, being 'the' cause, and cases of overdetermination and causation by absences.
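The core truth condition can be sketched in a few lines. The following is our illustration under strong simplifying assumptions (a finite state space and a discrete, deterministic time-evolution map), not the paper's formalism: a causal claim connecting set X to set Y is underwritten by the dynamics when every state in X evolves into Y.

```python
# Hypothetical deterministic time-evolution map on a finite state space.
step = {0: 1, 1: 2, 2: 3, 3: 0, 4: 2}

def evolve(state, t):
    """Apply the dynamics t times to a single state."""
    for _ in range(t):
        state = step[state]
    return state

def connected(X, Y, t):
    """True iff every state in X evolves into a state of Y after t steps."""
    return all(evolve(s, t) in Y for s in X)

# 'X at time 0 causes Y at time 2' in the sketched sense:
print(connected({0, 4}, {2, 3}, 2))  # 0 -> 1 -> 2 and 4 -> 2 -> 3, so True
```

A real dynamical system would replace the lookup table with a flow on phase space, but the truth condition (set-to-set connection by time evolution) keeps the same shape.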
When can statistical theories be causally closed?
The notion of common cause closedness of a classical, Kolmogorovian probability space with respect to a causal independence relation between the random events is defined, and propositions are presented that characterize common cause closedness for specific probability spaces. It is proved in particular that no probability space with a finite number of random events can contain common causes of all the correlations it predicts; however, it is demonstrated that probability spaces even with a finite number of random events can be common cause closed with respect to a causal independence relation that is stronger than logical independence. Furthermore, it is shown that infinite, atomless probability spaces are always common cause closed in the strongest possible sense. Open problems concerning common cause closedness are formulated, and the results are interpreted from the perspective of Reichenbach's Common Cause Principle.
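Reichenbach's common cause conditions, which the notion of common cause closedness builds on, can be checked concretely on a small space. The following is a minimal sketch with illustrative numbers of our own choosing (not from the paper): events A and B are correlated, and an event C screens the correlation off while raising the probability of both.

```python
from itertools import product

outcomes = list(product([0, 1], repeat=3))  # triples (a, b, c)

def weight(a, b, c):
    # Hypothetical model: given C, A and B are independent with
    # P(A|C) = P(B|C) = 0.8; given not-C, with P(A|not-C) = P(B|not-C) = 0.2.
    p_c, p_a, p_b = 0.5, (0.8 if c else 0.2), (0.8 if c else 0.2)
    return ((p_c if c else 1 - p_c)
            * (p_a if a else 1 - p_a)
            * (p_b if b else 1 - p_b))

P = {w: weight(*w) for w in outcomes}

def prob(event):
    return sum(P[w] for w in outcomes if event(w))

def cond(event, given):
    return prob(lambda w: event(w) and given(w)) / prob(given)

A = lambda w: w[0] == 1
B = lambda w: w[1] == 1
C = lambda w: w[2] == 1
notC = lambda w: w[2] == 0
AandB = lambda w: A(w) and B(w)

# Correlation: P(A & B) > P(A) P(B).
corr = prob(AandB) - prob(A) * prob(B)

# Screening off: conditional on C (and on not-C), A and B are independent.
screens_off = (abs(cond(AandB, C) - cond(A, C) * cond(B, C)) < 1e-9
               and abs(cond(AandB, notC) - cond(A, notC) * cond(B, notC)) < 1e-9)

# C raises the probability of both A and B.
raises = cond(A, C) > cond(A, notC) and cond(B, C) > cond(B, notC)

print(corr > 0, screens_off, raises)  # True True True
```

A space is common cause closed (with respect to a causal independence relation) roughly when every correlation between causally independent events admits such a C within the space itself; the paper's finitude result says this fails for the unrestricted logical independence relation.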
Bayes rules all: On the equivalence of various forms of learning in a probabilistic setting
Jeffrey conditioning is said to provide a more general method of assimilating uncertain evidence than Bayesian conditioning. We show that Jeffrey learning is merely a particular type of Bayesian learning if we accept either of the following two observations:
– Learning comprises both probability kinematics and proposition kinematics.
– What can be updated is not the same as what can do the updating; the set of the latter is richer than the set of the former.
We address the problem of commutativity and isolate commutativity from invariance upon conditioning on conjunctions. We also present a disjunctive model of Bayesian learning which suggests that Jeffrey conditioning is better understood as providing a method for incorporating unspecified but certain evidence rather than a method for incorporating specific but uncertain evidence. The results also generalize to many other subjective probability update rules, such as those proposed by Field (1978) and Gallow (2014).
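Jeffrey conditioning itself is easy to state numerically. The following is a minimal sketch with illustrative numbers (our assumptions, not the paper's examples): a prior over four worlds is updated so that the probability of a partition cell E shifts to a new value q, and when q = 1 the rule collapses to ordinary Bayesian conditioning on E.

```python
# Prior over worlds; E is one cell of the evidence partition {E, not-E};
# A is a proposition of interest. Numbers are illustrative assumptions.
P = {'w1': 0.2, 'w2': 0.3, 'w3': 0.4, 'w4': 0.1}
E = {'w1', 'w2'}
A = {'w1', 'w3'}

def prob(event, p=P):
    return sum(p[w] for w in event)

def jeffrey(q_E):
    """Jeffrey's rule: rescale within E to total q_E, within not-E to 1 - q_E,
    leaving the conditional probabilities inside each cell untouched."""
    pE = prob(E)
    return {w: (q_E * p / pE if w in E else (1 - q_E) * p / (1 - pE))
            for w, p in P.items()}

# Uncertain evidence: P(E) moves from 0.5 to 0.8.
P_new = jeffrey(0.8)
print(round(prob(A, P_new), 3))  # 0.48

# With q_E = 1, Jeffrey conditioning is Bayesian conditioning on E.
P_bayes = jeffrey(1.0)
assert abs(prob(A, P_bayes) - prob(A & E) / prob(E)) < 1e-12
```

The abstract's point can be read against this sketch: the update above can equally be represented as ordinary conditioning on a proposition in a richer algebra, which is what makes Jeffrey learning a special case of Bayesian learning.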