An introduction to lattice QCD at non-zero temperature and density
This is an informal overview of methods and results on the QCD phase diagram
and lattice thermodynamics, aimed at specialists in nearby fields. Comment: 15 pages; lecture at the GISELDA Meeting held in Frascati, Italy,
14-18 January 200
Channels’ Confirmation and Predictions’ Confirmation: From the Medical Test to the Raven Paradox
After long arguments between positivism and falsificationism, the verification of universal hypotheses was replaced with the confirmation of uncertain major premises. Unfortunately, Hempel proposed the Raven Paradox. Then, Carnap used the increment of logical probability as the confirmation measure. So far, many confirmation measures have been proposed. Measure F, proposed by Kemeny and Oppenheim, possesses the symmetries and asymmetries proposed by Eells and Fitelson, the monotonicity proposed by Greco et al., and the normalizing property suggested by many researchers. Based on the semantic information theory, a measure b* similar to F is derived from the medical test. Like the likelihood ratio, measures b* and F can only indicate the quality of channels or testing means, not the quality of probability predictions. Furthermore, it is still not easy to use b*, F, or another measure to clarify the Raven Paradox. For this reason, measure c*, similar to the correct rate, is derived. Measure c* supports the Nicod Criterion and undermines the Equivalence Condition, and hence can be used to eliminate the Raven Paradox. An example indicates that measures F and b* are helpful for diagnosing infection with the Novel Coronavirus, whereas most popular confirmation measures are not. Another example reveals that no popular confirmation measure can explain why a black raven confirms "Ravens are black" more strongly than a piece of chalk does. Measures F, b*, and c* indicate that the existence of fewer counterexamples matters more than the existence of more positive examples, and hence are compatible with Popper's falsification thought.
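To make the channel-vs-prediction distinction concrete, here is a minimal sketch (not from the abstract; the numeric sensitivity and specificity are assumed for illustration) of the Kemeny-Oppenheim measure F applied to a medical test. With h = "infected" and e = "test positive", F(h, e) = (P(e|h) - P(e|¬h)) / (P(e|h) + P(e|¬h)):

```python
def measure_F(p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Kemeny-Oppenheim confirmation measure F; ranges over [-1, 1]."""
    return (p_e_given_h - p_e_given_not_h) / (p_e_given_h + p_e_given_not_h)

# Assumed test characteristics (illustrative, not from the paper):
sensitivity = 0.90      # P(positive | infected)
false_positive = 0.05   # P(positive | not infected) = 1 - specificity

F = measure_F(sensitivity, false_positive)
LR = sensitivity / false_positive  # likelihood ratio of the test

# F is a monotone rescaling of the likelihood ratio, F = (LR - 1) / (LR + 1),
# so, like LR, it grades the testing means (the channel), not any single
# probability prediction made with it.
print(F)                    # 0.8947...
print((LR - 1) / (LR + 1))  # same value
```

Because F depends only on the two conditional probabilities P(e|h) and P(e|¬h), it is unchanged by the prevalence of the disease, which is exactly why it characterizes the channel rather than a prediction.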
Interference Phenomena in Electronic Transport Through Chaotic Cavities: An Information-Theoretic Approach
We develop a statistical theory describing quantum-mechanical scattering of a
particle by a cavity when the geometry is such that the classical dynamics is
chaotic. This picture is relevant to a variety of systems, ranging from atomic
nuclei to microwave cavities; the main application here is to electronic
transport through ballistic microstructures. The theory describes the regime in
which there are two distinct time scales, associated with a prompt and an
equilibrated response, and is cast in terms of the matrix of scattering
amplitudes S. The prompt response is related to the energy average of S which,
through ergodicity, is expressed as the average over an ensemble of systems. We
use an information-theoretic approach: the ensemble of S-matrices is determined
by (1) general physical features -- symmetry, causality, and ergodicity, (2) the
specific energy average of S, and (3) the notion of minimum information in the
ensemble. This ensemble, known as Poisson's kernel, is meant to describe those
situations in which any other information is irrelevant. Thus, one constructs
the one-energy statistical distribution of S using only information expressible
in terms of S itself without ever invoking the underlying Hamiltonian. This
formulation has a remarkable predictive power: from the distribution of S we
derive properties of the quantum conductance of cavities, including its
average, its fluctuations, and its full distribution in certain cases, both in
the absence and in the presence of a prompt response. We obtain good agreement
with the results of the numerical solution of the Schrödinger equation for
cavities in which either the prompt response is absent or there are two widely
separated time scales. Good agreement with experimental data is obtained once
temperature smearing and dephasing effects are taken into account. Comment: 38 pages, 11 ps files included, uses IOP style files and epsf.st
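For orientation, the one-energy distribution known as Poisson's kernel takes, in the random-matrix literature, a form along the following lines (a hedged sketch in notation not taken from this abstract: N is the number of channels, β = 1 or 2 labels the symmetry class, ⟨S⟩ is the energy-averaged S-matrix, and dμ(S) the invariant measure of the corresponding circular ensemble):

```latex
% Poisson's kernel: probability of S relative to the invariant measure d\mu(S)
dP_{\langle S\rangle}(S) \;=\;
  \frac{\bigl[\det\bigl(\mathbb{1}-\langle S\rangle\langle S\rangle^{\dagger}\bigr)\bigr]^{(\beta N+2-\beta)/2}}
       {\bigl|\det\bigl(\mathbb{1}-S\,\langle S\rangle^{\dagger}\bigr)\bigr|^{\,\beta N+2-\beta}}
  \, d\mu(S)
```

For ⟨S⟩ = 0 (no prompt response) this reduces to dμ(S) itself, i.e. to the standard circular ensembles; for β = 2 and N = 1 it reduces to the classical Poisson kernel on the unit disk, which is the origin of the name.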
Introduction to Quantum-Gravity Phenomenology
After a brief review of the first phase of development of Quantum-Gravity
Phenomenology, I argue that this research line is now ready to enter a more
advanced phase: while at first it was legitimate to resort to heuristic
order-of-magnitude estimates, which were sufficient to establish that
sensitivity to Planck-scale effects can be achieved, we should now rely on
detailed analyses of some reference test theories. I illustrate this point in
the specific example of studies of Planck-scale modifications of the
energy/momentum dispersion relation, for which I consider two test theories.
Both the photon-stability analyses and the Crab-nebula synchrotron-radiation
analyses, which had raised high hopes of ``beyond-Planckian'' experimental
bounds, turn out to be rather ineffective in constraining the two test
theories. Examples of analyses which can provide constraints of rather wide
applicability are the so-called ``time-of-flight analyses'', in the context of
observations of gamma-ray bursts, and the analyses of the cosmic-ray spectrum
near the GZK scale. Comment: 46 pages, LaTeX. Based on lectures given at the 40th Karpacz Winter
School in Theoretical Physics
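As a rough illustration of why time-of-flight analyses can constrain such test theories (a generic sketch, with placeholder symbols η and E_Pl that are not necessarily the paper's notation), a linear Planck-scale modification of the photon dispersion relation yields an energy-dependent speed and hence an arrival-time spread over a propagation distance L:

```latex
E^{2} \simeq p^{2}c^{2}\Bigl(1 + \eta\,\frac{E}{E_{\mathrm{Pl}}}\Bigr)
\;\;\Longrightarrow\;\;
v(E)=\frac{\partial E}{\partial p} \simeq c\Bigl(1 + \eta\,\frac{E}{E_{\mathrm{Pl}}}\Bigr),
\qquad
\Delta t \simeq |\eta|\,\frac{E_{2}-E_{1}}{E_{\mathrm{Pl}}}\,\frac{L}{c}
```

For gamma-ray bursts the distance L is cosmological while the intrinsic time structure is short, which is why even the tiny suppression factor E/E_Pl can accumulate into a measurable delay between photons of energies E_1 and E_2 (O(1) factors and signs depend on the convention adopted by each test theory).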
Qualitative individuation in permutation-invariant quantum mechanics
In this article I expound an understanding of the quantum mechanics of
so-called "indistinguishable" systems in which permutation invariance is taken
as a symmetry of a special kind, namely the result of representational
redundancy. This understanding has heterodox consequences for the understanding
of the states of constituent systems in an assembly and for the notion of
entanglement. It corrects widespread misconceptions about the inter-theoretic
relations between quantum mechanics and both classical particle mechanics and
quantum field theory. The most striking of the heterodox consequences are: (i)
that fermionic states ought not always to be considered entangled; (ii) it is
possible for two fermions or two bosons to be discerned using purely monadic
quantities; and that (iii) fermions (but not bosons) may always be so
discerned. Comment: 58 pages, 5 figures
Statistical Inference and the Plethora of Probability Paradigms: A Principled Pluralism
The major competing statistical paradigms share a common remarkable but unremarked thread: in many of their inferential applications, different probability interpretations are combined. How this plays out in different theories of inference depends on the type of question asked. We distinguish four question types: confirmation, evidence, decision, and prediction. We show that Bayesian confirmation theory mixes what are intuitively “subjective” and “objective” interpretations of probability, whereas the likelihood-based account of evidence melds three conceptions of what constitutes an “objective” probability.