Precise Estimation of Cosmological Parameters Using a More Accurate Likelihood Function
The estimation of cosmological parameters from a given data set requires a
construction of a likelihood function which, in general, has a complicated
functional form. We adopt a Gaussian copula and construct a copula likelihood
function for the convergence power spectrum from a weak lensing survey. We show
that the parameter estimation based on the Gaussian likelihood erroneously
introduces a systematic shift in the confidence region, in particular for the
dark energy equation-of-state parameter w. Thus, the copula likelihood
should be used in future cosmological observations.

Comment: 5 pages, 3 figures. Matches the version published in Physical Review Letters.
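A minimal sketch of the construction the abstract describes, assuming the per-bin marginal CDFs/PDFs and a correlation matrix are already in hand; the function names and interfaces here are illustrative, not the authors' code:

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def gaussian_copula_loglike(x, marginal_cdfs, marginal_pdfs, corr):
    """Log-likelihood of observation vector x under a Gaussian copula
    with the given marginals and correlation matrix (illustrative sketch)."""
    # map each component through its marginal CDF to a uniform, then to a standard normal
    u = np.array([cdf(xi) for cdf, xi in zip(marginal_cdfs, x)])
    z = norm.ppf(u)
    # copula density term: correlated Gaussian minus independent standard normals
    log_c = (multivariate_normal.logpdf(z, mean=np.zeros(len(x)), cov=corr)
             - norm.logpdf(z).sum())
    # the full log-likelihood adds back the marginal densities
    log_marg = sum(np.log(pdf(xi)) for pdf, xi in zip(marginal_pdfs, x))
    return log_c + log_marg
```

When the correlation matrix is the identity and the marginals are Gaussian, the copula term vanishes and this reduces to the usual Gaussian likelihood, which is the limit against which the abstract's systematic shift is measured.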
Strong Approximation of Empirical Copula Processes by Gaussian Processes
We provide a strong approximation of empirical copula processes by a
Gaussian process. In addition, we establish a strong approximation of the
smoothed empirical copula processes and a law of the iterated logarithm.
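For concreteness, the empirical copula underlying these processes can be evaluated at a point from paired samples via normalized ranks; this short sketch (not from the paper) shows the standard construction:

```python
import numpy as np
from scipy.stats import rankdata

def empirical_copula(x, y, u, v):
    """Empirical copula C_n(u, v) from paired samples (x_i, y_i)."""
    n = len(x)
    # pseudo-observations: normalized ranks stand in for the unknown marginal CDFs
    ru, rv = rankdata(x) / n, rankdata(y) / n
    return np.mean((ru <= u) & (rv <= v))
```

The empirical copula process is then the (scaled) difference between this estimate and the true copula; the paper's result approximates that process by a Gaussian process.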
Does a computer have an arrow of time?
In [Sch05a], it is argued that Boltzmann's intuition, that the psychological arrow of time is necessarily aligned with the thermodynamic arrow, is correct. Schulman gives an explicit physical mechanism for this connection, based on the brain being representable as a computer, together with certain thermodynamic properties of computational processes. [Haw94] presents similar, if briefer, arguments. The purpose of this paper is to critically examine the support for the link between thermodynamics and an arrow of time for computers. The principal arguments put forward by Schulman and Hawking will be shown to fail. It will be shown that any computational process that can take place in an entropy-increasing universe can equally take place in an entropy-decreasing universe. This conclusion does not automatically imply that a psychological arrow can run counter to the thermodynamic arrow. Some alternative possible explanations for the alignment of the two arrows will be briefly discussed.
Bayes and health care research.
Bayes’ rule shows how one might rationally change one’s beliefs in the light of evidence. It is the foundation of a statistical method called Bayesianism. In health care research, Bayesianism has its advocates but the dominant statistical method is frequentism.
There are at least two important philosophical differences between these methods. First, Bayesianism takes a subjectivist view of probability (i.e. that probability scores are statements of subjective belief, not objective fact) whilst frequentism takes an objectivist view. Second, Bayesianism is explicitly inductive (i.e. it shows how we may induce views about the world based on partial data from it) whereas frequentism is at least compatible with non-inductive views of scientific method, particularly the critical realism of Popper.
Popper and others detail significant problems with induction. Frequentism’s apparent ability to avoid these, plus its ability to give a seemingly more scientific and objective take on probability, lies behind its philosophical appeal to health care researchers.
However, there are also significant problems with frequentism, particularly its inability to assign probability scores to single events. Popper thus proposed an alternative objectivist view of probability, called propensity theory, which he allies to a theory of corroboration; but this too has significant problems, in particular, it may not successfully avoid induction. If this is so then Bayesianism might be philosophically the strongest of the statistical approaches. The article sets out a number of its philosophical and methodological attractions. Finally, it outlines a way in which critical realism and Bayesianism might work together.
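Bayes' rule itself is simple arithmetic; a worked single-event example in the health-care setting the abstract discusses (the numbers are purely illustrative) shows the kind of probability assignment frequentism struggles with:

```python
def posterior(prior, sensitivity, specificity):
    """P(disease | positive test) via Bayes' rule, for illustrative inputs."""
    # total probability of a positive test: true positives plus false positives
    p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_positive
```

With a 1% prevalence, 90% sensitivity, and 95% specificity, the posterior probability of disease given a positive test is only about 15%, a standard base-rate illustration of how the prior updates in the light of evidence.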
Consistent thermodynamics for spin echoes
Spin-echo experiments are often said to constitute an instance of
anti-thermodynamic behavior in a concrete physical system that violates the
second law of thermodynamics. We argue that a proper thermodynamic treatment of
the effect should take into account the correlations between the spin and
translational degrees of freedom of the molecules. To this end, we construct an
entropy functional using Boltzmann macrostates that incorporates both spin and
translational degrees of freedom. With this definition there is nothing special
in the thermodynamics of spin echoes: dephasing corresponds to Hamiltonian
evolution and leaves the entropy unchanged; dissipation increases the entropy.
In particular, there is no phase of entropy decrease in the echo. We also
discuss the definition of macrostates from the underlying quantum theory and we
show that the decay of net magnetization provides a faithful measure of entropy
change.

Comment: 15 pages, 2 figures. Changed figures; version to appear in PR.
An Alternative Interpretation of Statistical Mechanics
In this paper I propose an interpretation of classical statistical mechanics that centers on taking seriously the idea that probability measures represent complete states of statistical mechanical systems. I show how this leads naturally to the idea that the stochasticity of statistical mechanics is associated directly with the observables of the theory rather than with the microstates (as traditional accounts would have it). The usual assumption that microstates are representationally significant in the theory is therefore dispensable, a consequence which suggests interesting possibilities for developing non-equilibrium statistical mechanics and investigating inter-theoretic answers to the foundational questions of statistical mechanics.
Background-Independence
Intuitively speaking, a classical field theory is background-independent if
the structure required to make sense of its equations is itself subject to
dynamical evolution, rather than being imposed ab initio. The aim of this paper
is to provide an explication of this intuitive notion. Background-independence
is not a formal property of theories: the question whether a theory is
background-independent depends upon how the theory is interpreted. Under the
approach proposed here, a theory is fully background-independent relative to an
interpretation if each physical possibility corresponds to a distinct spacetime
geometry; and it falls short of full background-independence to the extent that
this condition fails.

Comment: Forthcoming in General Relativity and Gravitation.
A mixed effect model for bivariate meta-analysis of diagnostic test accuracy studies using a copula representation of the random effects distribution
Diagnostic test accuracy studies typically report the number of true positives, false positives, true negatives and false negatives. There usually exists a negative association between the numbers of true positives and true negatives, because studies that adopt a less stringent criterion for declaring a test positive yield higher sensitivities and lower specificities. A generalized linear mixed model (GLMM) is currently recommended to synthesize diagnostic test accuracy studies. We propose a copula mixed model for bivariate meta-analysis of diagnostic test accuracy studies. Our general model includes the GLMM as a special case and can also operate on the original scale of sensitivity and specificity. Summary receiver operating characteristic curves are deduced for the proposed model through quantile regression techniques and different characterizations of the bivariate random effects distribution. Our general methodology is demonstrated with an extensive simulation study and illustrated by re-analysing the data of two published meta-analyses. Our study suggests that there can be an improvement on the GLMM in fit to data and makes the argument for moving to copula random effects models. Our modelling framework is implemented in the package CopulaREMADA within the open source statistical environment R.
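The authors' implementation is the R package CopulaREMADA; as a language-neutral sketch of the core idea, the negatively associated (sensitivity, specificity) pairs can be modeled by a Gaussian copula with beta marginals on the original (0, 1) scale. All parameter values below are illustrative, not from the paper:

```python
import numpy as np
from scipy.stats import norm, beta

def sample_study_accuracies(n, rho, sens_ab=(8, 2), spec_ab=(9, 1), seed=0):
    """Draw n correlated (sensitivity, specificity) pairs: Gaussian copula
    with correlation rho and beta marginals (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal(np.zeros(2), cov, size=n)
    u = norm.cdf(z)                        # uniforms carrying the copula dependence
    sens = beta.ppf(u[:, 0], *sens_ab)     # beta marginal on the sensitivity scale
    spec = beta.ppf(u[:, 1], *spec_ab)     # beta marginal on the specificity scale
    return sens, spec
```

A negative rho reproduces the trade-off the abstract describes: draws with higher sensitivity tend to come with lower specificity, without leaving the natural (0, 1) scale.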
Basins of attraction on random topography
We investigate the consequences of fluid flowing on a continuous surface upon
the geometric and statistical distribution of the flow. We find that the
ability of a surface to collect water by its mere geometrical shape is
proportional to the curvature of the contour line divided by the local slope.
Consequently, rivers tend to lie in locations of high curvature and flat
slopes. Gaussian surfaces are introduced as a model of random topography. For
Gaussian surfaces the relation between convergence and slope is obtained
analytically. The convergence of flow lines correlates positively with drainage
area, so that lower slopes are associated with larger basins. As a consequence,
we explain the observed relation between the local slope of a landscape and the
area of the drainage basin geometrically. To some extent, the slope-area
relation comes about not because of fluvial erosion of the landscape, but
because of the way rivers choose their path. Our results are supported by
numerically generated surfaces as well as by real landscapes.
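A numerical sketch of the setup described above: generate a Gaussian random surface by smoothing white noise in Fourier space, then compute the local slope and the convergence of flow lines (the negative divergence of the unit downslope direction). The filter shape and parameter names are assumptions for illustration, not the authors' construction:

```python
import numpy as np

def gaussian_surface(n, corr_len, seed=0):
    """Random topography: white noise smoothed by a Gaussian filter in Fourier space."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((n, n))
    k = np.fft.fftfreq(n)
    kx, ky = np.meshgrid(k, k)
    filt = np.exp(-(kx**2 + ky**2) * corr_len**2)
    h = np.fft.ifft2(np.fft.fft2(noise) * filt).real
    return h / h.std()          # normalize to unit variance

def slope_and_convergence(h):
    """Local slope |grad h| and convergence of flow lines, -div(grad h / |grad h|)."""
    gy, gx = np.gradient(h)
    s = np.hypot(gx, gy) + 1e-12           # regularize flat points
    div_y, _ = np.gradient(gy / s)         # d(v_y)/dy
    _, div_x = np.gradient(gx / s)         # d(v_x)/dx
    return s, -(div_x + div_y)
```

On such a surface one can test the abstract's geometric claim directly, e.g. by checking how the convergence field relates to contour curvature divided by slope at each grid point.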