A critical analysis of Popper's experiment
An experiment which could decide against the Copenhagen interpretation of quantum mechanics has been proposed by K. Popper and subsequently criticized by M. J. Collett and R. Loudon. Here we show that both of the above-mentioned arguments are incorrect because they are based on a misuse of basic quantum rules.
Bayes and health care research.
Bayes' rule shows how one might rationally change one's beliefs in the light of evidence. It is the foundation of a statistical method called Bayesianism. In health care research, Bayesianism has its advocates, but the dominant statistical method is frequentism.
There are at least two important philosophical differences between these methods. First, Bayesianism takes a subjectivist view of probability (i.e. that probability scores are statements of subjective belief, not objective fact) whilst frequentism takes an objectivist view. Second, Bayesianism is explicitly inductive (i.e. it shows how we may induce views about the world based on partial data from it) whereas frequentism is at least compatible with non-inductive views of scientific method, particularly the critical realism of Popper.
Popper and others detail significant problems with induction. Frequentism's apparent ability to avoid these, plus its ability to give a seemingly more scientific and objective take on probability, lies behind its philosophical appeal to health care researchers.
However, there are also significant problems with frequentism, particularly its inability to assign probability scores to single events. Popper thus proposed an alternative objectivist view of probability, called propensity theory, which he allies to a theory of corroboration; but this too has significant problems, in particular, it may not successfully avoid induction. If this is so then Bayesianism might be philosophically the strongest of the statistical approaches. The article sets out a number of its philosophical and methodological attractions. Finally, it outlines a way in which critical realism and Bayesianism might work together.
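For orientation, Bayes' rule itself can be stated compactly (a standard formulation, not drawn from this article): for a hypothesis H and evidence E,

P(H | E) = P(E | H) P(H) / [ P(E | H) P(H) + P(E | not-H) P(not-H) ].

As an illustrative worked example with hypothetical screening numbers: for a condition with 1% prevalence and a test with 90% sensitivity and a 5% false-positive rate, a positive result updates the probability of disease to (0.9 × 0.01) / (0.9 × 0.01 + 0.05 × 0.99) ≈ 0.15, showing how a prior belief is revised in the light of evidence.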
Content & Watkins's account of natural axiomatizations
This paper briefly recounts the importance of the notion of natural axiomatizations for explicating hypothetico-deductivism, empirical significance, theoretical reduction, and organic fertility. Problems for the account of natural axiomatizations developed by John Watkins in Science and Scepticism and the revised account developed by Elie Zahar are demonstrated. It is then shown that Watkins's account can be salvaged from various counter-examples in a principled way by adding the demand that every axiom of a natural axiomatization should be part of the content of the theory being axiomatized. The crucial point here is that content cannot simply be identified with the set of logical consequences of a theory, but must be restricted to a proper subset of the consequence set. It is concluded that the revised Watkins account has certain advantages over the account of natural axiomatizations offered in Gemes (1993).
Distinct Quantum States Can Be Compatible with a Single State of Reality
Perhaps the quantum state represents information about reality, and not reality directly. Wave function collapse is then possibly no more mysterious than a Bayesian update of a probability distribution given new data. We consider models for quantum systems with measurement outcomes determined by an underlying physical state of the system, but where several quantum states are consistent with a single underlying state, i.e., probability distributions for distinct quantum states overlap. Significantly, we demonstrate by example that additional assumptions are always necessary to rule out such a model.
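In the ontological-models language commonly used for such arguments (sketched here for orientation; a standard framework rather than anything specific to this paper), preparing a quantum state ψ samples an underlying physical state λ from a distribution μ_ψ(λ), and a measurement outcome k then occurs with probability ξ(k | λ), with

∫ ξ(k | λ) μ_ψ(λ) dλ = |⟨k|ψ⟩|²

reproducing the Born rule. Two distinct quantum states ψ and φ are compatible with a single state of reality in this sense precisely when μ_ψ and μ_φ overlap, i.e. assign positive probability to a common region of λ-values.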
The Science of Phylogenetic Systematics: Explanation, Prediction, and Test
Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/73926/1/j.1096-0031.1999.tb00279.x.pd
A perspective on the landscape problem
I discuss the historical roots of the landscape problem and propose criteria for its successful resolution. This provides a perspective from which to evaluate the possibility of solving it in several of the speculative cosmological scenarios under study, including eternal inflation, cosmological natural selection and cyclic cosmologies.
Quantum erasure within the Optical Stern-Gerlach Model
In the optical Stern-Gerlach effect, the two branches into which the incoming atomic packet splits can display an interference pattern outside the cavity when a field measurement is made which erases the which-way information on the quantum paths the system can follow. Conversely, the mere possibility of acquiring this information causes a decoherence effect which cancels out the interference pattern. A phase space analysis is also carried out to investigate the negativity of the Wigner function and the connection between its covariance matrix and the distinguishability of the quantum paths.
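As a loose numerical illustration of Wigner-function negativity (a generic example using the QuTiP library, not the optical Stern-Gerlach calculation of the paper), one can build a superposition of two coherent states, standing in for the two which-way branches, and check that its Wigner function takes negative values:

```python
# Illustrative sketch (assumes QuTiP and NumPy are installed): Wigner function
# of a superposition of two coherent states; negativity signals the phase-space
# interference that which-way information would destroy.
import numpy as np
from qutip import coherent, wigner

N = 40                                    # Fock-space truncation
alpha = 2.0                               # amplitude of each branch
psi = (coherent(N, alpha) + coherent(N, -alpha)).unit()   # even "cat" state

xvec = np.linspace(-6, 6, 201)
W = wigner(psi, xvec, xvec)               # Wigner function on a phase-space grid

print("minimum of W:", W.min())           # clearly negative for this state
```

If the two branches are instead mixed incoherently (a 50/50 density matrix of the two coherent states), the same calculation gives an everywhere non-negative pair of Gaussian lobes, mirroring the loss of the interference pattern described above.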
Encouraging versatile thinking in algebra using the computer
In this article we formulate and analyse some of the obstacles to understanding the notion of a variable, and the use and meaning of algebraic notation, and report empirical evidence to support the hypothesis that an approach using the computer will be more successful in overcoming these obstacles. The computer approach is formulated within a wider framework of versatile thinking in which global, holistic processing complements local, sequential processing. This is done through a combination of programming in BASIC, physical activities which simulate computer storage and manipulation of variables, and specific software which evaluates expressions in standard mathematical notation. The software is designed to enable the user to explore examples and non-examples of a concept, in this case equivalent and non-equivalent expressions. We call such a piece of software a generic organizer because it offers examples and non-examples which may be seen not just in specific terms, but as typical, or generic, examples of the algebraic processes, assisting the pupil in the difficult task of abstracting the more general concept which they represent. Empirical evidence from several related studies shows that such an approach significantly improves the understanding of higher order concepts in algebra, and that any initial loss in manipulative facility through lack of practice is more than made up at a later stage.
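The expression-evaluating software itself is not reproduced here, but the underlying idea, letting pupils test candidate equivalences by evaluating both expressions at many values of the variable, can be sketched in a few lines (a hypothetical illustration in Python rather than the BASIC used in the original studies):

```python
# Hypothetical sketch of a "generic organizer" style check: two expressions in x
# are treated as (probably) equivalent if they agree at many sampled values.
import random

def probably_equivalent(f, g, trials=100, tol=1e-9):
    """Compare two functions of x at randomly sampled points."""
    for _ in range(trials):
        x = random.uniform(-10, 10)
        if abs(f(x) - g(x)) > tol:
            return False       # one disagreement yields a non-example
    return True                # agreement at every sample: an example

# An example and a non-example of equivalence, as a pupil might explore them:
print(probably_equivalent(lambda x: 2 * (x + 3), lambda x: 2 * x + 6))    # True
print(probably_equivalent(lambda x: (x + 3) ** 2, lambda x: x ** 2 + 9))  # False
```

Numerical sampling of this kind only ever gives evidence of equivalence, of course; the point of the organizer is the contrast between examples and non-examples, not a proof.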
The density matrix in the de Broglie-Bohm approach
If the density matrix is treated as an objective description of individual
systems, it may become possible to attribute the same objective significance to
statistical mechanical properties, such as entropy or temperature, as to
properties such as mass or energy. It is shown that the de Broglie-Bohm
interpretation of quantum theory can be consistently applied to density
matrices as a description of individual systems. The resultant trajectories are
examined for the case of the delayed choice interferometer, for which Bell
appears to suggest that such an interpretation is not possible. Bell's argument
is shown to be based upon a different understanding of the density matrix to
that proposed here.
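For orientation, one common way of letting a density matrix ρ(x, x', t) guide an individual system in this spirit (a standard form from the literature on density-matrix Bohmian dynamics, not necessarily the notation of the paper itself) is to move the configuration Q with velocity

dQ/dt = (ħ/m) Im[ ∇_x ρ(x, x', t) / ρ(x, x', t) ] evaluated at x = x' = Q,

which reduces to the usual de Broglie-Bohm guidance equation v = (ħ/m) Im[∇ψ/ψ] when ρ is the pure state ψ(x) ψ*(x').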
How do I know what my theory predicts?
To get evidence for or against a theory relative to the null hypothesis, one needs to know what the theory predicts. The amount of evidence can then be quantified by a Bayes factor. Specifying the sizes of the effect one's theory predicts may not come naturally, but I show some ways of thinking about the problem, some simple heuristics that are often useful when one has little relevant prior information. These heuristics include the room-to-move heuristic (for comparing mean differences), the ratio-of-scales heuristic (for regression slopes), the ratio-of-means heuristic (for regression slopes), the basic-effect heuristic (for analysis of variance effects), and the total-effect heuristic (for mediation analysis).
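As one illustration of how a specified predicted effect feeds into a Bayes factor (a generic numerical sketch with made-up numbers, not the article's own calculations), the data can be summarised by an observed mean difference and its standard error, a half-normal prior placed on the effect with a scale suggested by room-to-move style reasoning, and the average likelihood under that prior compared with the likelihood under the null:

```python
# Illustrative Bayes-factor sketch (hypothetical numbers): normal likelihood for
# an observed mean difference, half-normal prior on the effect predicted by H1.
import numpy as np
from scipy.stats import norm

def bayes_factor(obs_diff, se, prior_scale):
    """BF10 for H1: effect ~ half-normal(0, prior_scale) versus H0: effect = 0."""
    grid = np.linspace(0.0, 10.0 * prior_scale, 20001)        # effect sizes under H1
    prior = 2.0 * norm.pdf(grid, loc=0.0, scale=prior_scale)  # half-normal density
    likelihood = norm.pdf(obs_diff, loc=grid, scale=se)       # likelihood at each effect
    marginal_h1 = np.trapz(prior * likelihood, grid)          # average likelihood under H1
    marginal_h0 = norm.pdf(obs_diff, loc=0.0, scale=se)       # likelihood under the null
    return marginal_h1 / marginal_h0

# Room-to-move style reasoning (hypothetical): if the effect could plausibly be
# anywhere from 0 up to about 10 units, take the prior scale to be half of that.
print(bayes_factor(obs_diff=4.0, se=2.0, prior_scale=5.0))
```

The specific prior shape and scaling rule here are illustrative assumptions; the point is only that the Bayes factor cannot be computed until the theory's predicted effect sizes have been pinned down in some such way.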