A critical analysis of Popper's experiment
An experiment which could decide against the Copenhagen interpretation of quantum mechanics was proposed by K. Popper and subsequently criticized by M.J. Collett and R. Loudon. Here we show that both of the above-mentioned arguments are incorrect because they rest on a misuse of basic quantum rules.
Comment: 12 pages, 3 figures, RevTex; to be published in PR
The effects of multiple aerospace environmental stressors on human performance
An extended Fitts' law paradigm reaction time (RT) task was used to evaluate the effects of acceleration on human performance in the Dynamic Environment Simulator (DES) at Armstrong Laboratory, Wright-Patterson AFB, Ohio. This effort was combined with an evaluation of the standard CSU-13P anti-gravity suit versus three configurations of a 'retrograde inflation anti-G suit'. Results indicated that RT and error rates increased 17 percent and 14 percent, respectively, from baseline to the end of the simulated aerial combat maneuver, and that the most common error was pressing too few buttons.
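The Fitts' law paradigm mentioned above predicts movement time from target distance and width. A minimal sketch in Python; the constants a and b below are invented for illustration (real values come from fitting subject data, not from this study):

```python
import math

def index_of_difficulty(distance, width):
    """Fitts' index of difficulty in bits: ID = log2(2D / W)."""
    return math.log2(2 * distance / width)

def movement_time(distance, width, a=0.1, b=0.15):
    """Predicted movement time MT = a + b * ID.
    a (seconds) and b (seconds/bit) are hypothetical regression
    constants, not values from the experiment described above."""
    return a + b * index_of_difficulty(distance, width)

# A target 0.2 m away and 0.05 m wide has ID = log2(8) = 3 bits.
print(index_of_difficulty(0.2, 0.05))  # 3.0
print(movement_time(0.2, 0.05))        # ~0.55 s with the assumed constants
```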
Content & Watkins's account of natural axiomatizations
This paper briefly recounts the importance of the notion of natural axiomatizations for explicating hypothetico-deductivism, empirical significance, theoretical reduction, and organic fertility. Problems for the account of natural axiomatizations developed by John Watkins in Science and Scepticism and the revised account developed by Elie Zahar are demonstrated. It is then shown that Watkins's account can be salvaged from various counter-examples in a principled way by adding the demand that every axiom of a natural axiomatization should be part of the content of the theory being axiomatized. The crucial point here is that content cannot simply be identified with the set of logical consequences of a theory, but must be restricted to a proper subset of the consequence set. It is concluded that the revised Watkins account has certain advantages over the account of natural axiomatizations offered in Gemes (1993).
Bayes and health care research.
Bayes’ rule shows how one might rationally change one’s beliefs in the light of evidence. It is the foundation of a statistical method called Bayesianism. In health care research, Bayesianism has its advocates but the dominant statistical method is frequentism.
There are at least two important philosophical differences between these methods. First, Bayesianism takes a subjectivist view of probability (i.e. that probability scores are statements of subjective belief, not objective fact) whilst frequentism takes an objectivist view. Second, Bayesianism is explicitly inductive (i.e. it shows how we may induce views about the world based on partial data from it) whereas frequentism is at least compatible with non-inductive views of scientific method, particularly the critical realism of Popper.
Popper and others detail significant problems with induction. Frequentism’s apparent ability to avoid these, plus its ability to give a seemingly more scientific and objective take on probability, lies behind its philosophical appeal to health care researchers.
However, there are also significant problems with frequentism, particularly its inability to assign probability scores to single events. Popper thus proposed an alternative objectivist view of probability, called propensity theory, which he allies to a theory of corroboration; but this too has significant problems, in particular, it may not successfully avoid induction. If this is so then Bayesianism might be philosophically the strongest of the statistical approaches. The article sets out a number of its philosophical and methodological attractions. Finally, it outlines a way in which critical realism and Bayesianism might work together.
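Bayes' rule, as characterized in the abstract above, can be made concrete with a toy diagnostic-test calculation; the prevalence, sensitivity, and specificity figures below are invented for illustration:

```python
def posterior(prior, sensitivity, specificity):
    """P(disease | positive test) via Bayes' rule:
    P(D|+) = P(+|D) P(D) / [P(+|D) P(D) + P(+|not D) P(not D)]."""
    p_pos_given_disease = sensitivity
    p_pos_given_healthy = 1 - specificity  # false-positive rate
    numerator = p_pos_given_disease * prior
    evidence = numerator + p_pos_given_healthy * (1 - prior)
    return numerator / evidence

# Assumed figures: 1% prevalence, 90% sensitivity, 95% specificity.
# Even a fairly accurate test yields a modest posterior at low prevalence.
print(round(posterior(0.01, 0.90, 0.95), 3))  # 0.154
```

This is the subjectivist updating step the abstract contrasts with frequentism: the prior belief is revised in the light of the test evidence.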
CLASSICS OF THOUGHT AND MEDICAL HUMANITIES. The critical attitude in medicine: the need for a new ethics
A perspective on the landscape problem
I discuss the historical roots of the landscape problem and propose criteria for its successful resolution. This provides a perspective from which to evaluate the possibility of solving it in several of the speculative cosmological scenarios under study, including eternal inflation, cosmological natural selection and cyclic cosmologies.
Comment: Invited contribution for a special issue of Foundations of Physics titled: Forty Years Of String Theory: Reflecting On the Foundations. 31 pages, no figures
Absolute dimensions of eclipsing binaries. XXVIII. BK Pegasi and other F-type binaries: Prospects for calibration of convective core overshoot
We present a detailed study of the F-type detached eclipsing binary BK Peg,
based on new photometric and spectroscopic observations. The two components,
which have evolved to the upper half of the main-sequence band, are quite
different with masses and radii of (1.414 +/- 0.007 Msun, 1.988 +/- 0.008 Rsun)
and (1.257 +/- 0.005 Msun, 1.474 +/- 0.017 Rsun), respectively. The 5.49 day
period orbit of BK Peg is slightly eccentric (e = 0.053). The measured
rotational velocities are 16.6 +/- 0.2 (primary) and 13.4 +/- 0.2 (secondary)
km/s. For the secondary component this corresponds to (pseudo)synchronous
rotation, whereas the primary component seems to rotate at a slightly lower
rate. We derive an iron abundance of [Fe/H] =-0.12 +/- 0.07 and similar
abundances for Si, Ca, Sc, Ti, Cr and Ni. Yonsei-Yale and Victoria-Regina
evolutionary models for the observed metal abundance reproduce BK Peg at ages
of 2.75 and 2.50 Gyr, respectively, but tend to predict a lower age for the
more massive primary component than for the secondary. We find the same age
trend for three other upper main-sequence systems in a sample of well studied
eclipsing binaries with components in the 1.15-1.70 Msun range, where
convective core overshoot is gradually ramped up in the models. We also find
that the Yonsei-Yale models systematically predict higher ages than the
Victoria-Regina models. The sample includes BW Aqr, and as a supplement we have
determined a [Fe/H] abundance of -0.07 +/- 0.11 for this late F-type binary. We
propose to use BK Peg, BW Aqr, and other well-studied 1.15-1.70 Msun eclipsing
binaries to fine-tune convective core overshoot, diffusion, and possibly other
ingredients of modern theoretical evolutionary models.
Comment: Accepted for publication in Astronomy and Astrophysics
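The quoted masses and radii directly determine the components' surface gravities, a quantity routinely used alongside such absolute dimensions. A small sketch (the solar log g constant of 4.438 dex in cgs units is standard; the component values are taken from the abstract):

```python
import math

LOG_G_SUN = 4.438  # log10 of the solar surface gravity in cgs units (cm/s^2)

def log_g(mass_msun, radius_rsun):
    """Surface gravity in solar-scaled form:
    log g = log g_sun + log10(M/Msun) - 2 log10(R/Rsun)."""
    return LOG_G_SUN + math.log10(mass_msun) - 2 * math.log10(radius_rsun)

# BK Peg components, masses and radii as quoted in the abstract:
print(round(log_g(1.414, 1.988), 2))  # primary:   3.99
print(round(log_g(1.257, 1.474), 2))  # secondary: 4.20
```

The lower gravity of the more massive primary is consistent with its more evolved position in the upper half of the main-sequence band.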
Quantum erasure within the Optical Stern-Gerlach Model
In the optical Stern-Gerlach effect, the two branches into which the incoming atomic packet splits can display an interference pattern outside the cavity when a field measurement is made which erases the which-way information on the quantum paths the system can follow. On the contrary, the mere possibility of acquiring this information causes a decoherence effect which cancels out the interference pattern. A phase-space analysis is also carried out to investigate the negativity of the Wigner function and the connection between its covariance matrix and the distinguishability of the quantum paths.
Comment: 7 pages, 3 figures
Advancing Science with VGI: Reproducibility and Replicability of Recent Studies using VGI
In scientific research, reproducibility and replicability are requirements to ensure the advancement of our body of knowledge. This holds true also for VGI-related research and studies. However, the characteristics of VGI suggest particular difficulties in ensuring reproducibility and replicability. In this paper, we aim to examine the current situation in VGI-related research and identify strategies to ensure realization of its full potential. To do so, we first investigate the different aspects of reproducibility and replicability and their impact on VGI-related research. These impacts differ depending on the objectives of the study. Therefore, we examine the study focus of VGI-related research to assess the current body of research and structure our assessment. This work is based on a rigorous review of the elements of reproducibility and a systematic mapping and analysis of 58 papers on the use of VGI in the crisis management field. Results of our investigation show that reproducibility issues related to data are a serious concern, while reproducibility issues related to analysis methods and processes face fewer challenges. However, since most studies still focus on analyzing the source data, reproducibility and replicability remain an unsolved problem in VGI-related research. Therefore, we present initiatives tackling the problem, and finally formulate strategies to improve the situation.
Is the quantum world composed of propensitons?
In this paper I outline my propensiton version of quantum theory (PQT). PQT is a fully micro-realistic version of quantum theory that provides us with a very natural possible solution to the fundamental wave/particle problem, and is, as a result, free of the severe defects of orthodox quantum theory (OQT). PQT makes sense of the quantum world. PQT recovers all the empirical success of OQT and is, furthermore, empirically testable (although not as yet tested). I argue that Einstein almost put forward this version of quantum theory in 1916/17 in his papers on spontaneous and induced radiative transitions, but retreated from doing so because he disliked the probabilistic character of the idea. Subsequently, the idea was overlooked because debates about quantum theory polarised into the Bohr/Heisenberg camp, which argued for the abandonment of realism and determinism, and the Einstein/Schrödinger camp, which argued for the retention of realism and determinism; no one, as a result, pursued the most obvious option of retaining realism while abandoning determinism. It is this third, overlooked option that leads to PQT. PQT has implications for quantum field theory, the standard model, string theory, and cosmology. The really important point, however, is that it is experimentally testable. I indicate two experiments in principle capable of deciding between PQT and OQT.
