Statistical analysis of entropy correction from topological defects in Loop Black Holes
In this paper we discuss the entropy of quantum black holes in the LQG
formalism when the number of punctures on the horizon is treated as a quantum
hair, that is we compute the black hole entropy in the grand canonical (area)
ensemble. The entropy is a function of both the average area and the average
number of punctures and bears little resemblance to the Bekenstein-Hawking
entropy. In the thermodynamic limit, both the "temperature" and the chemical
potential can be shown to be functions only of the average area per puncture.
At a fixed temperature, the average number of punctures becomes proportional to
the average area and we recover the Bekenstein-Hawking area-entropy law to
leading order provided that the Barbero-Immirzi parameter, $\gamma$, is
appropriately fixed. This also relates the chemical potential to $\gamma$. We
obtain a sub-leading correction which differs in sign from that obtained in the
microcanonical and canonical ensembles but agrees with earlier results in the
grand canonical ensemble. Comment: 12 pages, no figures. Version to appear in Phys. Rev.
On the average uncertainty for systems with nonlinear coupling
The increased uncertainty and complexity of nonlinear systems have motivated
investigators to consider generalized approaches to defining an entropy
function. New insights are achieved by defining the average uncertainty in the
probability domain as a transformation of entropy functions. The Shannon
entropy when transformed to the probability domain is the weighted geometric
mean of the probabilities. For the exponential and Gaussian distributions, we
show that the weighted geometric mean of the distribution is equal to the
density of the distribution at the location plus the scale, i.e. at the width
of the distribution. The average uncertainty is generalized via the weighted
generalized mean, in which the moment is a function of the nonlinear source.
Both the Renyi and Tsallis entropies transform to this definition of the
generalized average uncertainty in the probability domain. For the generalized
Pareto and Student's t-distributions, which are the maximum entropy
distributions for these generalized entropies, the appropriate weighted
generalized mean also equals the density of the distribution at the location
plus scale. A coupled entropy function is proposed, which is equal to the
normalized Tsallis entropy divided by one plus the coupling. Comment: 24 pages, including 4 figures and 1 table
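The core identity here is easy to check numerically: the weighted geometric mean of the probabilities, $\prod_i p_i^{p_i}$, equals $e^{-H}$ with $H$ the Shannon entropy, and for an exponential distribution the continuous analogue $e^{-h}$ ($h$ the differential entropy) equals the density at the location plus the scale. A minimal sketch, with arbitrary illustrative probabilities and scale:

```python
import math

# Discrete case: weighted geometric mean of probabilities = exp(-H),
# where H is the Shannon entropy in nats.
p = [0.5, 0.25, 0.125, 0.125]
H = -sum(pi * math.log(pi) for pi in p)
geo_mean = math.prod(pi ** pi for pi in p)
assert abs(geo_mean - math.exp(-H)) < 1e-12

# Continuous case: exponential distribution, location 0 and scale theta.
# Differential entropy h = 1 + ln(theta), so exp(-h) = exp(-1)/theta,
# which is exactly the density f(x) = exp(-x/theta)/theta evaluated at
# x = theta (the location plus the scale), as the abstract states.
theta = 2.5
h = 1.0 + math.log(theta)
density_at_scale = math.exp(-theta / theta) / theta
print(math.exp(-h), density_at_scale)  # equal
```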
The entropy in finite $N$-unit nonextensive systems: the ordinary average and $q$-average
We have discussed the Tsallis entropy in finite $N$-unit nonextensive
systems, by using the multivariate $q$-Gaussian probability distribution
functions (PDFs) derived by the maximum-entropy method with the normal average
and the $q$-average ($q$: the entropic index). The Tsallis entropy obtained by
the $q$-average has an exponential dependence on $N$ for large $N$. In
contrast, the Tsallis entropy obtained by the normal average follows a
different asymptotic form for large $N$: the $N$
dependences of the Tsallis entropy obtained by the $q$- and normal averages are
generally quite different, although both results are in fairly good
agreement for $|q - 1| \ll 1$. The validity of the factorization
approximation to PDFs, which has been commonly adopted in the literature, has
been examined. We have calculated correlations defined by
$\langle (\delta x_i \, \delta x_j)^m \rangle$ for $i \neq j$, where
$\delta x_i = x_i - \langle x_i \rangle$ and
the bracket stands for the normal and $q$-averages. The
first-order correlation ($m = 1$) expresses the intrinsic correlation, and
higher-order correlations with $m \geq 2$ include the nonextensivity-induced
correlation, whose physical origin is elucidated in the superstatistics. Comment: 23 pages, 5 figures: the final version accepted in J. Math. Phys.
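The entropy in question is the standard Tsallis form $S_q = (1 - \sum_i p_i^q)/(q-1)$. For statistically independent (factorized) units it is pseudo-additive rather than additive, which is the root of the different $N$ dependences discussed above. A quick numerical check, with arbitrary illustrative probabilities:

```python
import math

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q)/(q - 1); -> Shannon as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

q = 1.8
pA = [0.7, 0.3]
pB = [0.2, 0.5, 0.3]
# Joint PDF of two independent units: here the factorization
# approximation is exact by construction.
pAB = [a * b for a in pA for b in pB]

# Pseudo-additivity: S_q(A,B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B)
lhs = tsallis(pAB, q)
rhs = tsallis(pA, q) + tsallis(pB, q) + (1 - q) * tsallis(pA, q) * tsallis(pB, q)
print(lhs, rhs)  # equal

# q -> 1 recovers the additive Shannon entropy.
print(tsallis(pAB, 1.0), tsallis(pA, 1.0) + tsallis(pB, 1.0))  # equal
```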
Integral Fluctuation Relations for Entropy Production at Stopping Times
A stopping time $\mathcal{T}$ is the first time at which a trajectory of a stochastic
process satisfies a given criterion. In this paper, we use martingale theory
to derive the integral fluctuation relation
$\langle e^{-S_{\rm tot}(\mathcal{T})} \rangle = 1$ for the stochastic entropy production
$S_{\rm tot}$ in a stationary physical system at stochastic stopping times $\mathcal{T}$. This fluctuation
relation implies the inequality $\langle S_{\rm tot}(\mathcal{T}) \rangle \geq 0$, which states
that it is not possible to reduce entropy on average, even by stopping a
stochastic process at a stopping time, and which we call the second law of
thermodynamics at stopping times. This law implies bounds on the average amount
of heat and work a system can extract from its environment when stopped at a
random time. Furthermore, the integral fluctuation relation implies that
certain fluctuations of entropy production are universal or are bounded by
universal functions. These universal properties descend from the integral
fluctuation relation by selecting appropriate stopping times: for example, when
the stopping time $\mathcal{T}$ is a first-passage time for entropy production, then we obtain a bound on
the statistics of negative records of entropy production. We illustrate these
results on simple models of nonequilibrium systems described by Langevin
equations and reveal two interesting phenomena. First, we demonstrate that
isothermal mesoscopic systems can extract on average heat from their
environment when stopped at a cleverly chosen moment and the second law at
stopping times provides a bound on the average extracted heat. Second, we
demonstrate that the average efficiency at stopping times of an autonomous
stochastic heat engine, such as Feynman's ratchet, can be larger than the
Carnot efficiency, and the second law of thermodynamics at stopping times
provides a bound on this average efficiency. Comment: 37 pages, 6 figures
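Both relations can be illustrated on about the simplest stochastic process available: a biased random walk, where each forward/backward step produces entropy $\pm\ln(p/(1-p))$. The walk and the stopping rule below (stop when $S$ first drops to $-a$, else at horizon $N$) are illustrative choices, not the paper's models; for any such bounded stopping time, $\langle e^{-S(\mathcal{T})}\rangle = 1$ and $\langle S(\mathcal{T})\rangle \geq 0$:

```python
import math
import random

random.seed(0)

p = 0.6                      # probability of a +1 step
ds = math.log(p / (1 - p))   # entropy production per forward step
N, a = 10, 2 * ds            # fixed horizon and first-passage threshold
n_traj = 200_000

exp_vals, s_vals = [], []
for _ in range(n_traj):
    s = 0.0
    for _ in range(N):
        s += ds if random.random() < p else -ds
        if s <= -a:          # stopping time: first passage below -a ...
            break            # ... or the horizon N, whichever comes first
    exp_vals.append(math.exp(-s))
    s_vals.append(s)

mean_exp = sum(exp_vals) / n_traj   # integral fluctuation relation: ~ 1
mean_s = sum(s_vals) / n_traj       # second law at stopping times: >= 0
print(mean_exp, mean_s)
```

Note that even though some trajectories are deliberately stopped at negative entropy production, the average stays non-negative, as the second law at stopping times requires.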
Typicality versus thermality: An analytic distinction
In systems with a large degeneracy of states such as black holes, one expects
that the average value of probe correlation functions will be well approximated
by the thermal ensemble. To understand how correlation functions in individual
microstates differ from the canonical ensemble average and from each other, we
study the variances in correlators. Using general statistical considerations,
we show that the variance between microstates will be exponentially suppressed
in the entropy. However, by exploiting the analytic properties of correlation
functions we argue that these variances are amplified in imaginary time,
thereby distinguishing pure states from the thermal density matrix. We
demonstrate our general results in specific examples and argue that our results
apply to the microstates of black holes. Comment: 22 pages + appendices, 3 eps figures
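The entropic suppression of microstate-to-microstate variance has a simple toy analogue (this is an illustration of the statistical mechanism, not of black-hole microstates): for Haar-random pure states in a $d$-dimensional Hilbert space, with entropy $S \sim \ln d$, the variance of a probe expectation value $\langle\psi|A|\psi\rangle$ scales like $1/(d+1) \sim e^{-S}$:

```python
import random
import statistics

random.seed(1)

def haar_state(d):
    """Haar-random pure state: normalized i.i.d. complex Gaussian vector."""
    v = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(d)]
    norm = sum(abs(z) ** 2 for z in v) ** 0.5
    return [z / norm for z in v]

def expectation(state, diag):
    """<psi|A|psi> for a diagonal observable A."""
    return sum(a * abs(z) ** 2 for a, z in zip(diag, state))

def variance_of_probe(d, samples=20_000):
    # Traceless observable with A^2 = I: half +1, half -1 eigenvalues.
    diag = [1.0] * (d // 2) + [-1.0] * (d // 2)
    vals = [expectation(haar_state(d), diag) for _ in range(samples)]
    return statistics.pvariance(vals)

v_small, v_large = variance_of_probe(4), variance_of_probe(32)
# Analytic prediction for this A: Var = 1/(d+1), so growing the
# "entropy" ln d suppresses the variance between random pure states.
print(v_small, v_large)   # ~1/5 vs ~1/33
```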