Analytic Continuation of Quantum Monte Carlo Data by Stochastic Analytical Inference
We present an algorithm for the analytic continuation of imaginary-time
quantum Monte Carlo data which is strictly based on principles of Bayesian
statistical inference. Within this framework we obtain an explicit
expression for a weighted average over possible energy spectra, which can
be evaluated by standard Monte Carlo simulations and which yields, as a
by-product, the distribution function of the regularization parameter. Our
algorithm thus avoids the ad-hoc assumptions introduced in similar
algorithms to fix the regularization parameter. We apply the algorithm
to imaginary-time quantum Monte Carlo data and compare the resulting energy
spectra with those from a standard maximum entropy calculation.
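The weighted average over spectra can be illustrated with a toy sketch: sample non-negative, normalized spectra with weight exp(-χ²/2α) at a fixed regularization parameter α. Everything below (the fermionic kernel, the grids, the value of α, and the Metropolis move) is an illustrative assumption, not the paper's stochastic analytical inference algorithm:

```python
import numpy as np

# Toy sketch (not the paper's algorithm): sample discretized spectra
# A(w) >= 0 with sum(A)*dw = 1, weighted by exp(-chi^2 / (2*alpha)) at a
# fixed regularization parameter alpha; averaging the samples would give
# the weighted mean spectrum.

rng = np.random.default_rng(0)
beta = 10.0                            # inverse temperature (assumed)
taus = np.linspace(0, beta, 20)        # imaginary-time grid
ws = np.linspace(-5, 5, 40)            # frequency grid
dw = ws[1] - ws[0]
# Fermionic kernel K(tau, w) = exp(-tau*w) / (1 + exp(-beta*w))
K = np.exp(-np.outer(taus, ws)) / (1.0 + np.exp(-beta * ws))

# Synthetic "QMC data": a Lorentzian spectrum pushed through the kernel
A_true = 1.0 / (1.0 + ((ws - 1.0) / 0.5) ** 2)
A_true /= A_true.sum() * dw
sigma = 1e-3
G = K @ A_true * dw + rng.normal(0, sigma, len(taus))

def chi2(A):
    r = (K @ A * dw - G) / sigma
    return float(r @ r)

alpha = 1.0
A = np.full(len(ws), 1.0 / (len(ws) * dw))   # flat starting spectrum
c0 = c = chi2(A)
for _ in range(20000):
    i, j = rng.integers(len(ws), size=2)
    move = 0.1 * rng.random() * A[i]
    Anew = A.copy()
    Anew[i] -= move                 # shift weight between two bins,
    Anew[j] += move                 # preserving the normalization
    cnew = chi2(Anew)
    if cnew <= c or rng.random() < np.exp((c - cnew) / (2 * alpha)):
        A, c = Anew, cnew

print(c < c0)   # the walk has improved the fit to the data
```

Averaging many such samples, at many values of α, is the expensive step that the paper evaluates by standard Monte Carlo.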
Relativistic particle acceleration in developing Alfv\'{e}n turbulence
A new particle acceleration process in developing Alfv\'{e}n turbulence,
arising in the course of successive parametric instabilities of a
relativistic pair plasma, is investigated using a one-dimensional
electromagnetic full particle code. Coherent wave-particle interactions
result in efficient particle
acceleration leading to a power-law like energy distribution function. In the
simulation high energy particles having large relativistic masses are
preferentially accelerated as the turbulence spectrum evolves in time. The
main acceleration mechanism is simultaneous relativistic resonance between a
particle and two different waves. An analytical expression for the maximum
attainable energy in such wave-particle interactions is derived. Comment: 15 pages, 9 figures, 1 table
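For context, coherent wave-particle interaction of this kind is governed by the relativistic cyclotron resonance condition; the relation below is the textbook form (with harmonic number n, gyrofrequency Ω_c, and Lorentz factor γ), not the paper's derived expression for the maximum attainable energy:

```latex
% Textbook relativistic resonance condition between a particle
% (parallel velocity v_parallel, Lorentz factor gamma) and a wave
% (frequency omega, parallel wavenumber k_parallel):
\omega - k_\parallel v_\parallel = \frac{n\,\Omega_c}{\gamma},
\qquad \Omega_c = \frac{q B_0}{m}, \quad n \in \mathbb{Z}.
```

Simultaneous resonance means a single particle satisfies this condition for two different (ω, k_parallel) pairs at once.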
Business organisations and the legitimation of market inequalities
High levels of economic inequality are associated with a wide range of negative social outcomes. Public opinion surveys, moreover, consistently show that a majority of people are concerned about existing levels of inequality (even as they tend to underestimate how unequal their own societies are). Why, then, have the citizens of many western democracies been so reluctant to support policies and parties that are committed to addressing inequality? This paper, situated in the context of an especially small, distant and open economy, explores the discursive practices of business lobby groups and, more specifically, those groups' arguments against higher wages at the lower end of the earnings distribution. It argues that these practices simultaneously rely on and reinforce a broad acceptance of the proposition that the social institution of the market is an immutable force that powerfully constrains society's ability to pursue greater equality.
Exploring Multi-Modal Distributions with Nested Sampling
In performing a Bayesian analysis, two difficult problems often emerge.
First, in estimating the parameters of some model for the data, the resulting
posterior distribution may be multi-modal or exhibit pronounced (curving)
degeneracies. Secondly, in selecting between a set of competing models,
calculation of the Bayesian evidence for each model is computationally
expensive using existing methods such as thermodynamic integration. Nested
Sampling is a Monte Carlo method targeted at the efficient calculation of the
evidence, but also produces posterior inferences as a by-product and therefore
provides a means to carry out parameter estimation as well as model selection.
The main challenge in implementing Nested Sampling is to sample from a
constrained probability distribution. One possible solution to this problem is
provided by the Galilean Monte Carlo (GMC) algorithm. We show results of
applying Nested Sampling with GMC to some problems which have proven very
difficult for standard Markov Chain Monte Carlo (MCMC) and down-hill methods,
due to the presence of a large number of local minima and/or pronounced (curving)
degeneracies between the parameters. We also discuss the use of Nested Sampling
with GMC in Bayesian object detection problems, which are inherently
multi-modal and require the evaluation of Bayesian evidence for distinguishing
between true and spurious detections. Comment: Refereed conference proceeding, presented at the 32nd International
Workshop on Bayesian Inference and Maximum Entropy Methods in Science and
Engineering
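A minimal nested-sampling sketch makes the evidence calculation concrete. It replaces the constrained-sampling step with naive rejection from the prior rather than Galilean Monte Carlo, and the one-dimensional Gaussian problem is an invented example: with a uniform prior of density 1/2 on [-1, 1] and a unit-mass Gaussian likelihood, the evidence is close to 0.5:

```python
import numpy as np

rng = np.random.default_rng(1)

def loglike(x):
    # Gaussian likelihood centred at 0 with sigma = 0.1 (invented problem)
    return -0.5 * (x / 0.1) ** 2 - 0.5 * np.log(2 * np.pi * 0.1 ** 2)

nlive = 200
live = rng.uniform(-1, 1, nlive)       # live points drawn from the prior
live_ll = loglike(live)

logZ, logX = -np.inf, 0.0              # running evidence, log prior volume
for _ in range(1200):
    worst = int(np.argmin(live_ll))
    logX_new = logX - 1.0 / nlive      # mean log-shrinkage per iteration
    logw = np.log(np.exp(logX) - np.exp(logX_new))   # shell width
    logZ = np.logaddexp(logZ, logw + live_ll[worst])
    Lstar, logX = live_ll[worst], logX_new
    while True:                        # naive rejection: redraw from the
        cand = rng.uniform(-1, 1)      # prior until L(cand) > L*
        if loglike(cand) > Lstar:
            live[worst], live_ll[worst] = cand, loglike(cand)
            break

# add the contribution of the remaining live points
logZ = np.logaddexp(logZ, logX + np.logaddexp.reduce(live_ll) - np.log(nlive))
Z = float(np.exp(logZ))
print(Z)   # close to the analytic value of about 0.5
```

The rejection step is exactly what becomes infeasible in high dimensions, which is why schemes such as Galilean Monte Carlo are needed for the constrained draw.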
Fiscal Transparency, Gubernatorial Popularity, and the Scale of Government: Evidence from the States.
We explore the effect of transparency of fiscal institutions on the scale of government and gubernatorial popularity using a formal model of accountability. We construct an index of fiscal transparency for the American states from detailed budgetary information. With cross-section data for 1986-1995, we find that, on average and controlling for other influential factors, fiscal transparency increases both the scale of government and gubernatorial popularity. The results, subjected to extensive robustness checks, imply that more transparent budget institutions induce greater effort by politicians, to which voters give higher job approval, on average. Voters also respond by entrusting greater resources to politicians where institutions are more transparent, leading to a larger size of government.
Entropic criterion for model selection
Model or variable selection is usually achieved by ranking models in
increasing order of preference. One such method applies the
Kullback-Leibler distance, or relative entropy, as a selection criterion. Yet
this raises two questions: why use this criterion, and are there other
criteria? Moreover, conventional approaches require a reference prior, which is
usually difficult to obtain. Following the logic of inductive inference proposed
by Caticha, we show relative entropy to be a unique criterion, which requires
no prior information and can be applied to different fields. We examine this
criterion by considering a physical problem, simple fluids, and the results are
promising. Comment: 10 pages. Accepted for publication in Physica A, 200
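As a toy illustration of relative entropy as a ranking criterion (the distributions below are invented, and, unlike the abstract's prior-free criterion, this sketch compares candidates against a known target distribution):

```python
import numpy as np

# Invented toy: rank candidate models by their relative entropy
# (Kullback-Leibler divergence) from a target distribution.

def kl(p, q):
    """D(p || q) = sum_i p_i * log(p_i / q_i), taking 0*log(0) = 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

target = np.array([0.1, 0.4, 0.4, 0.1])
models = {
    "uniform": np.array([0.25, 0.25, 0.25, 0.25]),
    "peaked":  np.array([0.05, 0.45, 0.45, 0.05]),
    "skewed":  np.array([0.40, 0.30, 0.20, 0.10]),
}
ranking = sorted(models, key=lambda m: kl(target, models[m]))
print(ranking[0])   # the model closest to the target in KL distance
```

The asymmetry of D(p || q) is deliberate: it penalizes models q that assign little probability where the target p has mass.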
Nested sampling for Potts models
Nested sampling is a new Monte Carlo method by Skilling [1] intended for general Bayesian computation. Nested sampling provides a robust alternative to annealing-based methods for computing normalizing constants. It can also generate estimates of other quantities such as posterior expectations. The key technical requirement is an ability to draw samples uniformly from the prior subject to a constraint on the likelihood. We provide a demonstration with the Potts model, an undirected graphical model.
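For a Potts model small enough to enumerate, the normalizing constant that nested sampling estimates can be checked by brute force; the lattice size, number of states q, and coupling below are illustrative choices:

```python
import itertools
import numpy as np

# q, lattice size, and beta are illustrative: a 2x2 lattice has only
# q**4 = 81 configurations, so the normalizing constant Z, the quantity
# nested sampling estimates when enumeration is impossible, can be
# computed exactly by brute force.

q, beta = 3, 0.5
edges = [(0, 1), (2, 3), (0, 2), (1, 3)]   # bonds of a 2x2 grid, sites 0..3

def energy(s):
    # Potts energy: -1 for every bond whose endpoints are in the same state
    return -sum(1 for i, j in edges if s[i] == s[j])

Z = float(sum(np.exp(-beta * energy(s))
              for s in itertools.product(range(q), repeat=4)))
print(Z)
```

On larger lattices the number of configurations grows as q to the number of sites, and the constrained prior draw over spin configurations becomes the hard step.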
Acceleration of energetic particles by large-scale compressible magnetohydrodynamic turbulence
Fast particles diffusing along magnetic field lines in a turbulent plasma can
diffuse through and then return to the same eddy many times before the eddy is
randomized in the turbulent flow. This leads to an enhancement of particle
acceleration by large-scale compressible turbulence relative to previous
estimates in which isotropic particle diffusion is assumed. Comment: 13 pages, 3 figures, accepted for publication in Ap
Glitch or anti-glitch: a Bayesian view
The sudden spin-down in the rotation of magnetar 1E 2259+586 observed by Archibald et al.
(2013) was a rare event. However, this particular event, referred to as an anti-glitch, was followed
by another event which Archibald et al. (2013) suggested could be either a conventional glitch or
another anti-glitch. Although there is no accompanying radiation activity or pulse profile change, there
is decisive evidence for the existence of the second timing event, judging from the timing data. We
apply Bayesian Model Selection to quantitatively determine which of these possibilities better explains
the observed data. We show that the observed data strongly support the presence of two successive
anti-glitches, with a Bayes factor, often called the odds ratio, greater than 40. Furthermore, we show
that the second anti-glitch has an associated frequency change Δν of -8.2 × 10^-8 Hz. We discuss the
implications of these results for possible physical mechanisms behind this anti-glitch.
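The model comparison can be sketched schematically: compute a crude marginal likelihood for a negative-step model and a positive-step model over simulated residuals, then form their ratio. All data, priors, and step sizes below are invented; only the threshold of 40 is taken from the abstract:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated "timing residuals" with a negative frequency step at t = 25
# (all numbers invented; only the threshold of 40 comes from the abstract)
n = 50
t = np.arange(n)
data = np.where(t < 25, 0.0, -5.0) + rng.normal(0, 1.0, n)

def loglike(step):
    # Gaussian log-likelihood of the residuals given a step of this size
    model = np.where(t < 25, 0.0, step)
    r = data - model
    return -0.5 * float(r @ r) - 0.5 * n * np.log(2 * np.pi)

# Crude evidence for each hypothesis: average the likelihood over a flat
# prior on the step size (negative step = anti-glitch, positive = glitch)
steps_neg = np.linspace(-10, 0, 1001)
steps_pos = np.linspace(0, 10, 1001)
logZ_anti = np.logaddexp.reduce([loglike(s) for s in steps_neg]) - np.log(1001)
logZ_glitch = np.logaddexp.reduce([loglike(s) for s in steps_pos]) - np.log(1001)

bayes_factor = float(np.exp(logZ_anti - logZ_glitch))
print(bayes_factor > 40)   # strong support for the anti-glitch model
```

Averaging the likelihood over the prior, rather than maximizing it, is what lets the Bayes factor penalize the model whose prior wastes mass on step sizes the data rule out.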