Where would we be without counterfactuals?
Huw Price gives his inaugural lecture as Bertrand Russell Professor of Philosophy. Bertrand Russell's celebrated essay "On the Notion of Cause" was first delivered to the Aristotelian Society on 4 November 1912, as Russell's Presidential Address. The piece is best known for a passage in which its author deftly positions himself between the traditional metaphysics of causation and the British crown, firing broadsides in both directions: "The law of causality", Russell declares, "like much that passes muster in philosophy, is a relic of a bygone age, surviving, like the monarchy, only because it is erroneously supposed to do no harm." To mark the lecture's centenary, we offer a contemporary view of the issues Russell here puts on the table, and of the health or otherwise, at the end of the essay's first century, of his notorious conclusion.
On the nature of continuous physical quantities in classical and quantum mechanics
Within the traditional Hilbert space formalism of quantum mechanics, it is
not possible to describe a particle as possessing, simultaneously, a sharp
position value and a sharp momentum value. Is it possible, though, to describe
a particle as possessing just a sharp position value (or just a sharp momentum
value)? Some, such as Teller (Journal of Philosophy, 1979), have thought that
the answer to this question is No -- that the status of individual continuous
quantities is very different in quantum mechanics than in classical mechanics.
On the contrary, I shall show that the same subtle issues arise with respect to
continuous quantities in classical and quantum mechanics; and that it is, after
all, possible to describe a particle as possessing a sharp position value
without altering the standard formalism of quantum mechanics.
Comment: 26 pages, LaTeX
A Decidable Multi-agent Logic for Reasoning About Actions, Instruments, and Norms
We formally introduce a novel, yet ubiquitous, category of norms: norms of instrumentality. Norms of this category describe which actions are obligatory, or prohibited, as instruments for certain purposes. We propose the Logic of Agency and Norms (LAN) that enables reasoning about actions, instrumentality, and normative principles in a multi-agent setting. Leveraging LAN, we formalize norms of instrumentality and compare them to two prevalent norm categories: norms to be and norms to do. Lastly, we pose principles relating the three categories and evaluate their validity vis-à-vis notions of deliberative acting. On a technical note, the logic is shown to be decidable via the finite model property.
Radiative Muon Capture on Hydrogen and the Induced Pseudoscalar Coupling
The first measurement of the elementary process mu^- p -> n nu_mu gamma (radiative muon capture on hydrogen) is reported. A photon pair spectrometer was used to measure the partial branching ratio for photons of k > 60 MeV. The weak pseudoscalar coupling constant g_p was determined from the partial branching ratio; the first error is the quadrature sum of statistical and systematic uncertainties, and the second error is due to the uncertainty in the decay rate of the ortho to para muonic molecule. This value of g_p is 1.5 times the prediction of PCAC and pion-pole dominance.
Comment: 13 pages, RevTeX, 3 figures (encapsulated postscript), submitted to Phys. Rev. Lett.
MLP: a MATLAB toolbox for rapid and reliable auditory threshold estimation
In this paper, we present MLP, a MATLAB toolbox for estimating auditory thresholds via the adaptive maximum likelihood procedure proposed by David Green (1990, 1993). This adaptive procedure is particularly appealing for psychologists who need to estimate thresholds accurately and in a short time. Together with a description of the toolbox, the current text provides an introduction to threshold estimation theory and a theoretical explanation of the maximum likelihood adaptive procedure. MLP comes with a graphical interface and includes several built-in, classic psychoacoustics experiments ready to run at a mouse click.
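The maximum likelihood adaptive procedure described above can be sketched as follows. This is a minimal illustration in Python rather than MATLAB, assuming a logistic psychometric function with known slope and a simulated listener; all names and parameter values are illustrative and are not MLP's actual API.

```python
import numpy as np

def logistic(x, midpoint, slope=1.0):
    # Probability of a "yes" response to a stimulus at level x,
    # for a psychometric function centered at `midpoint`.
    return 1.0 / (1.0 + np.exp(-slope * (x - midpoint)))

# Candidate psychometric functions: a grid of hypothesized thresholds.
midpoints = np.linspace(0.0, 100.0, 201)

def update_likelihood(stim, response, log_like):
    # Add the log-likelihood of one (stimulus, response) pair
    # under every candidate function.
    p = np.clip(logistic(stim, midpoints), 1e-9, 1 - 1e-9)
    return log_like + np.where(response, np.log(p), np.log(1.0 - p))

# Simulated listener with a true threshold of 40 (arbitrary units).
rng = np.random.default_rng(0)
true_threshold = 40.0
log_like = np.zeros_like(midpoints)
stim = 70.0  # initial stimulus level

for _ in range(30):
    response = rng.random() < logistic(stim, true_threshold)
    log_like = update_likelihood(stim, response, log_like)
    # Next trial: present the threshold of the currently most likely candidate.
    stim = midpoints[np.argmax(log_like)]

print(f"estimated threshold: {stim:.1f}")
```

The key idea, as in Green's procedure, is that each trial updates the likelihood of every candidate function and the next stimulus is placed where the best candidate is most informative, so the track homes in on the threshold in few trials.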
3D Reconstruction for Partial Data Electrical Impedance Tomography Using a Sparsity Prior
In electrical impedance tomography the electrical conductivity inside a physical body is computed from electrostatic boundary measurements. The focus of this paper is to extend recent results for the 2D problem to 3D. Prior information about the sparsity and spatial distribution of the conductivity is used to improve reconstructions for the partial data problem, in which Cauchy data are measured only on a subset of the boundary. A sparsity prior is enforced using the l^1 norm in the penalty term of a Tikhonov functional, and spatial prior information is incorporated by applying a spatially distributed regularization parameter. The optimization problem is solved numerically using a generalized conditional gradient method with soft thresholding. Numerical examples show the effectiveness of the suggested method even for the partial data problem with measurements affected by noise.
Comment: 10 pages, 3 figures. arXiv admin note: substantial text overlap with arXiv:1405.655
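The soft-thresholding update with a spatially distributed regularization parameter can be illustrated on a toy linear inverse problem. This sketch uses a plain proximal-gradient (ISTA-style) iteration rather than the paper's generalized conditional gradient method on the full nonlinear EIT problem; the forward operator, sparsity pattern, and all parameter values are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of the weighted l1 norm:
    # shrinks each coefficient toward zero by its own threshold tau.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

# Toy underdetermined linear problem: recover a sparse "conductivity perturbation".
rng = np.random.default_rng(1)
n, m = 50, 30
A = rng.standard_normal((m, n)) / np.sqrt(m)  # stand-in forward operator
x_true = np.zeros(n)
x_true[[5, 20, 33]] = [1.0, -0.5, 0.8]
b = A @ x_true

# Spatially distributed regularization parameter:
# a weaker penalty where the inclusions are expected to lie.
alpha = np.full(n, 0.05)
alpha[[5, 20, 33]] = 0.005

x = np.zeros(n)
step = 0.1  # conservative step size for the gradient part
for _ in range(500):
    grad = A.T @ (A @ x - b)                 # gradient of the data-fidelity term
    x = soft_threshold(x - step * grad, step * alpha)  # sparsity-enforcing update

print("max reconstruction error:", np.max(np.abs(x - x_true)))
```

The spatially varying `alpha` plays the role of the paper's distributed regularization parameter: coefficients in the expected support are penalized less, so they survive thresholding, while spurious coefficients elsewhere are driven to zero.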
The open future, bivalence and assertion
It is highly intuitive that the future is open and the past is closed: whereas it is unsettled whether there will be a fourth world war, it is settled that there was a first. Recently, it has become increasingly popular to claim that the intuitive openness of the future implies that contingent statements about the future, such as "there will be a sea battle tomorrow," are non-bivalent (neither true nor false). In this paper, we argue that the non-bivalence of future contingents is at odds with our pre-theoretic intuitions about the openness of the future. These are revealed by our pragmatic judgments concerning the correctness and incorrectness of assertions of future contingents. We argue that the pragmatic data, together with a plausible account of assertion, show that in many cases we take future contingents to be true (or to be false), even though we take the future to be open in relevant respects. It follows that appeals to intuition to support the non-bivalence of future contingents are untenable. Intuition favours bivalence.
Constraining Antimatter Domains in the Early Universe with Big Bang Nucleosynthesis
We consider the effect of a small-scale matter-antimatter domain structure on
big bang nucleosynthesis and place upper limits on the amount of antimatter in
the early universe. For small domains, which annihilate before nucleosynthesis,
this limit comes from underproduction of He-4. For larger domains, the limit
comes from He-3 overproduction. Most of the He-3 produced in antiproton-helium annihilation is itself annihilated; the main source of the surviving He-3 is photodisintegration of He-4 by the electromagnetic cascades initiated by the annihilation.
Comment: 4 pages, 2 figures, RevTeX (slightly shortened)