
    Statistical analysis of entropy correction from topological defects in Loop Black Holes

    In this paper we discuss the entropy of quantum black holes in the LQG formalism when the number of punctures on the horizon is treated as a quantum hair, that is, we compute the black hole entropy in the grand canonical (area) ensemble. The entropy is a function of both the average area and the average number of punctures and bears little resemblance to the Bekenstein-Hawking entropy. In the thermodynamic limit, both the "temperature" and the chemical potential can be shown to be functions only of the average area per puncture. At a fixed temperature, the average number of punctures becomes proportional to the average area and we recover the Bekenstein-Hawking area-entropy law to leading order, provided that the Barbero-Immirzi parameter $\gamma$ is appropriately fixed. This also relates the chemical potential to $\gamma$. We obtain a sub-leading correction which differs in sign from that obtained in the microcanonical and canonical ensembles, but agrees with earlier results in the grand canonical ensemble. Comment: 12 pages, no figures. Version to appear in Phys. Rev.

    On the average uncertainty for systems with nonlinear coupling

    The increased uncertainty and complexity of nonlinear systems have motivated investigators to consider generalized approaches to defining an entropy function. New insights are achieved by defining the average uncertainty in the probability domain as a transformation of entropy functions. The Shannon entropy, when transformed to the probability domain, is the weighted geometric mean of the probabilities. For the exponential and Gaussian distributions, we show that the weighted geometric mean of the distribution is equal to the density of the distribution at the location plus the scale, i.e. at the width of the distribution. The average uncertainty is generalized via the weighted generalized mean, in which the moment is a function of the nonlinear source. Both the Renyi and Tsallis entropies transform to this definition of the generalized average uncertainty in the probability domain. For the generalized Pareto and Student's t-distributions, which are the maximum entropy distributions for these generalized entropies, the appropriate weighted generalized mean also equals the density of the distribution at the location plus scale. A coupled entropy function is proposed, which is equal to the normalized Tsallis entropy divided by one plus the coupling. Comment: 24 pages, including 4 figures and 1 table
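    The probability-domain transforms described in this abstract can be checked numerically: $e^{-H}$ for the Shannon entropy $H$ equals the self-weighted geometric mean $\prod_i p_i^{p_i}$, and $e^{-H_\alpha}$ for the Renyi entropy of order $\alpha$ equals the weighted generalized mean of the probabilities with moment $\alpha-1$. A minimal sketch (the distribution is an arbitrary choice, not from the paper):

    ```python
    import numpy as np

    p = np.array([0.5, 0.3, 0.2])  # an arbitrary probability distribution

    # Shannon entropy, transformed to the probability domain via exp(-H)
    H = -np.sum(p * np.log(p))
    avg_uncertainty = np.exp(-H)

    # Weighted geometric mean of the probabilities, weighted by themselves
    geo_mean = np.prod(p ** p)
    # avg_uncertainty == geo_mean, since exp(-H) = exp(sum p*log p)

    # Renyi entropy of order alpha transforms to the weighted generalized
    # mean of the probabilities with moment r = alpha - 1
    alpha = 2.0
    H_renyi = np.log(np.sum(p ** alpha)) / (1.0 - alpha)
    gen_mean = np.sum(p * p ** (alpha - 1.0)) ** (1.0 / (alpha - 1.0))
    # exp(-H_renyi) == gen_mean

    # The paper's claim for the exponential distribution f(x) = lam*exp(-lam*x):
    # exp(-H) = lam/e, the density evaluated at the scale 1/lam.
    lam = 2.0
    H_exp = 1.0 - np.log(lam)
    density_at_scale = lam * np.exp(-lam * (1.0 / lam))
    ```

    The first two identities hold for any discrete distribution; the third uses the known differential entropy $H = 1 - \ln\lambda$ of the exponential distribution.
    
    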

    The entropy in finite N-unit nonextensive systems: the ordinary average and q-average

    We have discussed the Tsallis entropy in finite N-unit nonextensive systems, using the multivariate q-Gaussian probability distribution functions (PDFs) derived by the maximum-entropy method with the normal average and the q-average (q: the entropic index). The Tsallis entropy obtained by the q-average has an exponential $N$ dependence: $S_q^{(N)}/N \simeq e^{(1-q) N S_1^{(1)}}$ for large $N$ ($\gg \frac{1}{1-q} > 0$). In contrast, the Tsallis entropy obtained by the normal average is given by $S_q^{(N)}/N \simeq 1/[(q-1)N]$ for large $N$ ($\gg \frac{1}{q-1} > 0$). The $N$ dependences of the Tsallis entropy obtained by the q- and normal averages are generally quite different, although both results are in fairly good agreement for $|q-1| \ll 1$. The validity of the factorization approximation to PDFs, which has been commonly adopted in the literature, has been examined. We have calculated correlations defined by $C_m = \langle (\delta x_i \, \delta x_j)^m \rangle - \langle (\delta x_i)^m \rangle \, \langle (\delta x_j)^m \rangle$ for $i \neq j$, where $\delta x_i = x_i - \langle x_i \rangle$ and the bracket $\langle \cdot \rangle$ stands for the normal or q-average. The first-order correlation ($m=1$) expresses the intrinsic correlation, and higher-order correlations with $m \geq 2$ include the nonextensivity-induced correlation, whose physical origin is elucidated in superstatistics. Comment: 23 pages, 5 figures; the final version accepted in J. Math. Phys.

    Integral Fluctuation Relations for Entropy Production at Stopping Times

    A stopping time $T$ is the first time when a trajectory of a stochastic process satisfies a specific criterion. In this paper, we use martingale theory to derive the integral fluctuation relation $\langle e^{-S_{\rm tot}(T)} \rangle = 1$ for the stochastic entropy production $S_{\rm tot}$ in a stationary physical system at stochastic stopping times $T$. This fluctuation relation implies the law $\langle S_{\rm tot}(T) \rangle \geq 0$, which states that it is not possible to reduce entropy on average, even by stopping a stochastic process at a stopping time, and which we call the second law of thermodynamics at stopping times. This law implies bounds on the average amount of heat and work a system can extract from its environment when stopped at a random time. Furthermore, the integral fluctuation relation implies that certain fluctuations of entropy production are universal or are bounded by universal functions. These universal properties descend from the integral fluctuation relation by selecting appropriate stopping times: for example, when $T$ is a first-passage time for entropy production, we obtain a bound on the statistics of negative records of entropy production. We illustrate these results on simple models of nonequilibrium systems described by Langevin equations and reveal two interesting phenomena. First, we demonstrate that isothermal mesoscopic systems can on average extract heat from their environment when stopped at a cleverly chosen moment, and that the second law at stopping times provides a bound on the average extracted heat. Second, we demonstrate that the average efficiency at stopping times of an autonomous stochastic heat engine, such as Feynman's ratchet, can be larger than the Carnot efficiency, with the second law of thermodynamics at stopping times providing a bound on that average efficiency. Comment: 37 pages, 6 figures
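    The martingale structure behind the integral fluctuation relation can be seen in a toy model simpler than the paper's Langevin examples (this sketch, with its parameters, is an assumption for illustration): a biased random walk whose entropy production changes by $\pm\ln(p/q)$ per step, so that $e^{-S_{\rm tot}}$ is a martingale, stopped at the first exit of $S_{\rm tot}$ from an interval.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Biased random walk: step +1 with probability p, -1 with probability q = 1-p.
    # Each +1 step produces entropy +ln(p/q), each -1 step -ln(p/q);
    # exp(-S) is then a martingale: p*(q/p) + q*(p/q) = q + p = 1.
    p = 0.6
    dS = np.log(p / (1.0 - p))
    a, b = 2.0, 2.0      # stopping time: S first leaves the interval (-a, b)
    n_max = 10_000       # hard cap, so the stopping time is bounded

    def entropy_at_stopping_time(rng):
        S = 0.0
        for _ in range(n_max):
            S += dS if rng.random() < p else -dS
            if S >= b or S <= -a:
                break
        return S

    samples = np.array([entropy_at_stopping_time(rng) for _ in range(100_000)])
    ift = np.mean(np.exp(-samples))   # ~1: integral fluctuation relation
    mean_S = np.mean(samples)         # > 0: second law at stopping times
    ```

    By the optional stopping theorem the estimate of $\langle e^{-S_{\rm tot}(T)}\rangle$ converges to 1, while $\langle S_{\rm tot}(T)\rangle$ stays positive even though the stopping rule is free to select trajectories.
    
    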

    Typicality versus thermality: An analytic distinction

    In systems with a large degeneracy of states, such as black holes, one expects that the average value of probe correlation functions will be well approximated by the thermal ensemble. To understand how correlation functions in individual microstates differ from the canonical ensemble average and from each other, we study the variances in correlators. Using general statistical considerations, we show that the variance between microstates will be exponentially suppressed in the entropy. However, by exploiting the analytic properties of correlation functions we argue that these variances are amplified in imaginary time, thereby distinguishing pure states from the thermal density matrix. We demonstrate our general results in specific examples and argue that our results apply to the microstates of black holes. Comment: 22 pages + appendices, 3 eps figures
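    The exponential suppression of microstate-to-microstate variance has a standard typicality analogue that is easy to check numerically (a toy sketch, not the paper's calculation): for Haar-random pure states in a Hilbert space of dimension $d = e^S$, the variance of a fixed observable's expectation value across states scales as $1/d = e^{-S}$, so doubling $d$ roughly quarters... rather, multiplying $d$ by 4 divides the variance by about 4.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def microstate_variance(d, n_states=4000):
        """Variance of <psi|A|psi> over random pure states in dimension d,
        for a fixed traceless observable A = diag(+1,...,+1,-1,...,-1)."""
        A = np.concatenate([np.ones(d // 2), -np.ones(d // 2)])
        # Haar-random states: normalized complex Gaussian vectors
        psi = rng.normal(size=(n_states, d)) + 1j * rng.normal(size=(n_states, d))
        psi /= np.linalg.norm(psi, axis=1, keepdims=True)
        expvals = np.sum(A * np.abs(psi) ** 2, axis=1)
        return np.var(expvals)

    # Quadrupling the Hilbert-space dimension d = e^S should divide the
    # microstate-to-microstate variance by roughly 4 (it is ~1/(d+1) here).
    v_small, v_large = microstate_variance(32), microstate_variance(128)
    ratio = v_small / v_large  # ~4
    ```

    For this observable the exact Haar variance is $1/(d+1)$, so the expected ratio is $129/33 \approx 3.9$; the Monte Carlo estimate fluctuates around that value.
    
    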