Only A Daisy Blossom : Waltz Song
Accuracy, conditionalization, and probabilism
Accuracy-based arguments for conditionalization and probabilism appear to have a significant advantage over their Dutch Book rivals. They rely only on the plausible epistemic norm that one should try to decrease the inaccuracy of one's beliefs. Furthermore, conditionalization and probabilism apparently follow from a wide range of measures of inaccuracy. However, we argue that there is an under-appreciated diachronic constraint on measures of inaccuracy which limits the measures from which one can prove conditionalization, and none of the remaining measures allow one to prove probabilism. That is, among the measures in the literature, there are some from which one can prove conditionalization, others from which one can prove probabilism, but none from which one can prove both. Hence at present, the accuracy-based approach cannot underwrite both conditionalization and probabilism.
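As an illustrative sketch (not taken from the paper), the Brier score is one standard measure of inaccuracy discussed in this literature, and conditionalization is the update rule at issue; the toy worlds and numbers below are assumptions for the example:

```python
# Illustrative sketch (not from the paper): the Brier score as an
# inaccuracy measure, plus Bayesian conditionalization as the update rule.

def brier_inaccuracy(credences, truth):
    """Squared distance between a credence function and the truth-values
    (1 for the actual world, 0 for every other world)."""
    return sum((credences[w] - (1.0 if w == truth else 0.0)) ** 2
               for w in credences)

def conditionalize(credences, evidence):
    """Conditionalization: zero out worlds ruled out by the evidence
    and renormalize the remaining credences."""
    total = sum(credences[w] for w in evidence if w in credences)
    return {w: (credences[w] / total if w in evidence else 0.0)
            for w in credences}

# Uniform prior over four worlds; the evidence rules out w3 and w4.
prior = {"w1": 0.25, "w2": 0.25, "w3": 0.25, "w4": 0.25}
posterior = conditionalize(prior, {"w1", "w2"})

# If w1 is actual, conditionalizing on true evidence lowers Brier inaccuracy.
assert brier_inaccuracy(posterior, "w1") < brier_inaccuracy(prior, "w1")
```

The paper's point is that whether such an inaccuracy-reduction argument goes through depends on which measure replaces `brier_inaccuracy` here; different measures underwrite different norms.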
Toward a Formal Analysis of Deceptive Signaling
Deception has long been an important topic in philosophy (see Augustine 1952; Kant 1996; Chisholm & Feehan 1977; Mahon 2007; Carson 2010). However, the traditional analysis of the concept, which requires that a deceiver intentionally cause her victim to have a false belief, rules out the possibility of much deception in the animal kingdom. Cognitively unsophisticated species, such as fireflies and butterflies, have simply evolved to mislead potential predators and/or prey. To capture such cases of "functional deception," several researchers (e.g., Sober 1994; Hauser 1997; Searcy & Nowicki 2005; Skyrms 2010) have endorsed the broader view that deception only requires that a deceiver benefit from sending a misleading signal. Moreover, in order to facilitate game-theoretic study of deception in the context of Lewisian sender-receiver games, Brian Skyrms has proposed an influential formal analysis of this view. Such formal analyses have the potential to enhance our philosophical understanding of deception in humans as well as animals. However, as we argue in this paper, Skyrms's analysis, as well as two recently proposed alternative analyses (viz., Godfrey-Smith 2011; McWhirter 2016), are seriously flawed and can lead us to draw unwarranted conclusions about deception.
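A minimal sketch of the Skyrms-style setup, assuming a two-state, two-signal sender-receiver game: on the broad "functional deception" view, a signal is misleading when it raises the receiver's probability of a state that does not obtain, and deceptive when the sender also benefits from sending it. The state names, likelihoods, and game here are illustrative assumptions, not Skyrms's own example:

```python
# Hedged sketch of "functional deception" in a Lewisian sender-receiver
# game.  The states, signals, and probabilities are illustrative
# assumptions, not drawn from Skyrms's analysis verbatim.

def receiver_posterior(prior, likelihood, signal):
    """Receiver's updated credences P(state | signal) via Bayes' rule.
    likelihood[state][signal] = probability the sender sends `signal`
    when `state` obtains."""
    joint = {s: prior[s] * likelihood[s][signal] for s in prior}
    total = sum(joint.values())
    return {s: p / total for s, p in joint.items()}

def is_misleading(prior, likelihood, signal, actual_state):
    """Misleading (on the broad view): receiving the signal raises the
    receiver's probability of at least one non-actual state."""
    post = receiver_posterior(prior, likelihood, signal)
    return any(post[s] > prior[s] for s in prior if s != actual_state)

# Two states, two signals.  In state "predator" the sender sometimes
# sends "sig_safe", which raises the receiver's credence in the
# non-actual state "safe".
prior = {"safe": 0.5, "predator": 0.5}
likelihood = {
    "safe":     {"sig_safe": 1.0, "sig_danger": 0.0},
    "predator": {"sig_safe": 0.5, "sig_danger": 0.5},
}

# "sig_safe" sent when a predator is present is misleading; it counts as
# deception on the broad view if the sender also benefits from sending it.
assert is_misleading(prior, likelihood, "sig_safe", "predator")
```

The competing formal analyses criticized in the paper differ precisely over how to cash out "misleading" and "benefit" in structures like this one.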
Design and commissioning of a timestamp-based data acquisition system for the DRAGON recoil mass separator
The DRAGON recoil mass separator at TRIUMF exists to study radiative proton and alpha capture reactions, which are important in a variety of astrophysical scenarios. DRAGON experiments require a data acquisition system that can be triggered on either reaction product (γ ray or heavy ion), with the additional requirement of being able to promptly recognize coincidence events in an online environment. To this end, we have designed and implemented a new data acquisition system for DRAGON which consists of two independently triggered readouts. Events from both systems are recorded with timestamps from a common clock, which are used to tag coincidences in the earliest possible stage of the data analysis. Here we report on the design, implementation, and commissioning of the new DRAGON data acquisition system, including the hardware, trigger logic, coincidence reconstruction algorithm, and live time considerations. We also discuss the results of an experiment commissioning the new system, which measured the strength of a resonance in the 20Ne(p, γ)21Na radiative proton capture reaction.
Comment: 11 pages, 7 figures; accepted for publication in EPJ A "Tools for Experiment and Theory"
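The coincidence-reconstruction idea described in the abstract can be sketched as follows. This is an illustrative sketch, not the collaboration's actual code: the two-cursor merge over time-sorted event streams, the tick-based timestamps, and the window width are all assumptions for the example.

```python
# Illustrative sketch of timestamp-based coincidence matching between two
# independently triggered readouts ("head" gamma-ray and "tail" heavy-ion
# events).  Timestamps, in clock ticks from a common clock, are assumed
# time-sorted; the window width is an assumption for the example.

def match_coincidences(head_ts, tail_ts, window):
    """Walk the two sorted timestamp lists with two cursors and return
    (head, tail) pairs whose separation is at most `window` ticks."""
    pairs = []
    i = j = 0
    while i < len(head_ts) and j < len(tail_ts):
        dt = head_ts[i] - tail_ts[j]
        if abs(dt) <= window:
            pairs.append((head_ts[i], tail_ts[j]))
            i += 1
            j += 1
        elif dt < 0:          # head event too early: advance head cursor
            i += 1
        else:                 # tail event too early: advance tail cursor
            j += 1
    return pairs

# Gamma-ray ("head") and heavy-ion ("tail") trigger times in clock ticks.
head = [100, 250, 400, 900]
tail = [103, 380, 905, 1500]
print(match_coincidences(head, tail, window=10))
# -> [(100, 103), (900, 905)]
```

Because each stream is already time-ordered, a single linear pass suffices, which is what makes tagging coincidences feasible at the earliest stage of an online analysis.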
Improving the Acceptance of Isolated Elementary School Children
The purposes of this study were: (1) to develop a program based upon a combination of previously tested techniques, (2) to adapt these techniques for use by school personnel within the classroom situation, (3) to test this program in an all-black, multi-age kindergarten and a first-grade classroom in an inner-city school, and (4) to evaluate the effectiveness of this program.