14,757 research outputs found

    Entropy and temperature of black holes in a gravity's rainbow

    The linear relation between the entropy and area of a black hole can be derived from the Heisenberg principle, the energy-momentum dispersion relation of special relativity, and general considerations about black holes. There exist results in quantum gravity and related contexts suggesting the modification of the usual dispersion relation and uncertainty principle. One of these contexts is the gravity's rainbow formalism. We analyze the consequences of such a modification for black hole thermodynamics from the perspective of two distinct rainbow realizations built from doubly special relativity. One is the proposal of Magueijo and Smolin and the other is based on a canonical implementation of doubly special relativity put forward recently by the authors. In these scenarios, we obtain modified expressions for the entropy and temperature of black holes. We show that, for a family of doubly special relativity theories satisfying certain properties, the temperature can vanish in the limit of zero black hole mass. For the Magueijo and Smolin proposal, this is only possible for some restricted class of models with bounded energy and unbounded momentum. With the proposal of a canonical implementation, on the other hand, the temperature may vanish for more general theories; in particular, the momentum may also be bounded, with bounded or unbounded energy. This opens new possibilities for the outcome of black hole evaporation in the framework of a gravity's rainbow.
    Comment: 11 pages, 2 new references added, version accepted for publication in Physical Review
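The derivation the abstract alludes to is a standard back-of-the-envelope argument (sketched here for orientation, not taken from the paper): a quantum absorbed by a black hole of horizon radius $r_s = 2GM/c^2$ can be localized only to $\Delta x \sim r_s$, so the Heisenberg principle together with the special-relativistic dispersion relation $E \simeq pc$ fixes the area gained per absorbed quantum.

```latex
% Heuristic sketch (standard argument, not from the paper):
E \;\sim\; \frac{\hbar c}{r_s}, \qquad
\delta A \;=\; 8\pi\, r_s\, \delta r_s
        \;=\; \frac{16\pi G}{c^4}\, r_s\, E
        \;\sim\; 16\pi\, \frac{\hbar G}{c^3}
        \;=\; 16\pi\, \ell_P^2 ,
% so each absorbed quantum adds of order one Planck unit of area, giving
S \;\sim\; k_B\, \frac{A}{\ell_P^2} \quad (\text{entropy linear in the area } A).
```

Modifying the dispersion relation or the uncertainty principle, as in gravity's rainbow, changes the relation $E(r_s)$ in the first step and hence deforms the entropy-area law.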

    Nonmonotonic Probabilistic Logics between Model-Theoretic Probabilistic Logic and Probabilistic Logic under Coherence

    Recently, it has been shown that probabilistic entailment under coherence is weaker than model-theoretic probabilistic entailment. Moreover, probabilistic entailment under coherence is a generalization of default entailment in System P. In this paper, we continue this line of research by presenting probabilistic generalizations of more sophisticated notions of classical default entailment that lie between model-theoretic probabilistic entailment and probabilistic entailment under coherence. That is, the new formalisms properly generalize their counterparts in classical default reasoning, they are weaker than model-theoretic probabilistic entailment, and they are stronger than probabilistic entailment under coherence. The new formalisms are especially useful for handling probabilistic inconsistencies related to conditioning on zero events. They can also be applied to probabilistic belief revision. More generally, in the same spirit as a similar previous paper, this paper sheds light on exciting new formalisms for probabilistic reasoning beyond the well-known standard ones.
    Comment: 10 pages; in Proceedings of the 9th International Workshop on Non-Monotonic Reasoning (NMR-2002), Special Session on Uncertainty Frameworks in Nonmonotonic Reasoning, pages 265-274, Toulouse, France, April 2002

    Building a case for a Planck-scale-deformed boost action: the Planck-scale particle-localization limit

    "Doubly-special relativity" (DSR), the idea of a Planck-scale Minkowski limit that is still a relativistic theory, but with both the Planck scale and the speed-of-light scale as nontrivial relativistic invariants, was proposed (gr-qc/0012051) as a physics intuition for several scenarios which may arise in the study of the quantum-gravity problem, but most DSR studies focused exclusively on the search for formalisms for the description of a specific example of such a Minkowski limit. A novel contribution to the DSR physics intuition came from a recent paper by Smolin (hep-th/0501091) suggesting that the emergence of the Planck scale as a second nontrivial relativistic invariant might be inevitable in quantum gravity, relying only on some rather robust expectations concerning the semiclassical approximation of quantum gravity. I here attempt to strengthen Smolin's argument by observing that an analysis of some independently-proposed Planck-scale particle-localization limits, such as the "Generalized Uncertainty Principle" often attributed to string theory in the literature, also suggests that the emergence of a DSR Minkowski limit might be inevitable. I discuss a possible link between this observation and recent results on logarithmic corrections to the entropy-area black-hole formula, and I observe that both the analysis here reported and Smolin's analysis appear to suggest that the examples of DSR Minkowski limits for which a formalism has been sought in the literature might not be sufficiently general. I also stress that, as we now contemplate the hypothesis of a DSR Minkowski limit, there is an additional challenge for those in the quantum-gravity community attributing to the Planck length the role of "fundamental length scale".
    Comment: 12 pages, LaTeX
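For orientation, the "Generalized Uncertainty Principle" mentioned above is commonly quoted in the string-inspired form below (a standard textbook parametrization, not reproduced from this paper; $\alpha$ is a model-dependent constant of order unity):

```latex
% Commonly quoted GUP form and the minimum localization it implies:
\Delta x \;\gtrsim\; \frac{\hbar}{\Delta p} \;+\; \alpha\,\ell_P^2\,\frac{\Delta p}{\hbar},
% minimizing the right-hand side over \Delta p (at \Delta p = \hbar/(\sqrt{\alpha}\,\ell_P)) gives
\Delta x_{\min} \;\sim\; 2\sqrt{\alpha}\;\ell_P ,
```

i.e. a particle-localization limit at the Planck length, which is exactly the kind of observer-independent scale whose compatibility with relativistic invariance motivates DSR.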

    The Uncertainty Relation in "Which-Way" Experiments: How to Observe Directly the Momentum Transfer using Weak Values

    A which-way measurement destroys the twin-slit interference pattern. Bohr argued that distinguishing between two slits a distance s apart gives the particle a random momentum transfer \wp of order h/s. This was accepted for more than 60 years, until Scully, Englert and Walther (SEW) proposed a which-way scheme that, they claimed, entailed no momentum transfer. Storey, Tan, Collett and Walls (STCW) in turn proved a theorem that, they claimed, showed that Bohr was right. This work reviews and extends a recent proposal [Wiseman, Phys. Lett. A 311, 285 (2003)] to resolve the issue using a weak-valued probability distribution for momentum transfer, P_wv(\wp). We show that P_wv(\wp) must be wider than h/6s. However, its moments can still be zero because P_wv(\wp) is not necessarily positive definite. Nevertheless, it is measurable in a way understandable to a classical physicist. We introduce a new measure of spread for P_wv(\wp): half of the unit-confidence interval, and conjecture that it is never less than h/4s. For an idealized example with infinitely narrow slits, the moments of P_wv(\wp) and of the momentum distributions are undefined unless a process of apodization is used. We show that by considering successively smoother initial wave functions, successively more moments of both P_wv(\wp) and the momentum distributions become defined. For this example the moments of P_wv(\wp) are zero, and these are equal to the changes in the moments of the momentum distribution. We prove that this relation holds for schemes in which the moments of P_wv(\wp) are non-zero, but only for the first two moments. We also compare these moments to those of two other momentum-transfer distributions and \hat{p}_f-\hat{p}_i. We find agreement between all of these, but again only for the first two moments.
    Comment: 14 pages, 6 figures, submitted to J. Opt.
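Bohr's original estimate, which sets the h/s scale the abstract repeatedly compares against, follows from elementary uncertainty-relation reasoning (a standard textbook sketch, not taken from this paper):

```latex
% Distinguishing which of two slits a distance s apart the particle traversed
% requires resolving its position to \Delta x \lesssim s/2, so the measurement
% disturbs the transverse momentum by at least
\wp \;\sim\; \Delta p \;\gtrsim\; \frac{\hbar}{2\,\Delta x} \;\sim\; \frac{h}{s}
% (up to factors of order unity), comparable to the momentum-space fringe
% spacing, and hence enough to wash out the twin-slit interference pattern.
```

The SEW-STCW controversy turns on whether this disturbance must appear as a classical random kick of this size, or only in the weaker, weak-valued sense quantified by P_wv(\wp).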

    Effects due to a scalar coupling on the particle-antiparticle production in the Duffin-Kemmer-Petiau theory

    The Duffin-Kemmer-Petiau formalism with vector and scalar potentials is used to point out a few misconceptions diffused in the literature. It is explicitly shown that the scalar coupling makes the DKP formalism not equivalent to the Klein-Gordon formalism or to the Proca formalism, and that the spin-1 sector of the DKP theory looks formally like the spin-0 sector. With proper boundary conditions, scattering of massive bosons in an arbitrary mixed vector-scalar square step potential is explored in a simple way, and effects due to the scalar coupling on the particle-antiparticle production and localization of bosons are analyzed in some detail.

    The theory and phenomenology of perturbative QCD based jet quenching

    The study of the structure of strongly interacting dense matter via hard jets is reviewed. High momentum partons produced in hard collisions produce a shower of gluons prior to undergoing the non-perturbative process of hadronization. In the presence of a dense medium this shower is modified due to scattering of the various partons off the constituents in the medium. The modified pattern of the final detected hadrons is then a probe of the structure of the medium as perceived by the jet. Starting from the factorization paradigm developed for the case of particle collisions, we review the basic underlying theory of medium-induced gluon radiation based on perturbative Quantum Chromodynamics (pQCD) and current experimental results from Deep Inelastic Scattering on large nuclei and high energy heavy-ion collisions, emphasizing how these results constrain our understanding of energy loss. This review contains introductions to the theory of radiative energy loss, elastic energy loss, and the corresponding experimental observables and issues. We close with a discussion of important calculations and measurements that need to be carried out to complete the description of jet modification at high energies at future high energy colliders.
    Comment: 78 pages, 24 figures, submitted to Prog. Part. Nucl. Phys.
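A useful benchmark for the radiative energy loss discussed above is the standard BDMPS-Z parametric estimate for a thick static medium (quoted here for orientation, not as a result of this review):

```latex
% Medium-induced radiative energy loss, thick-medium (BDMPS-Z) limit:
\Delta E \;\simeq\; \frac{\alpha_s\, C_R}{4}\; \hat{q}\, L^2 ,
% where \hat{q} is the jet-quenching transport coefficient (mean squared
% transverse momentum transferred to the parton per unit path length),
% C_R the color Casimir of the projectile, and L the in-medium path length.
% The characteristic L^2 growth reflects the coherence (LPM suppression)
% of the induced gluon radiation.
```

Extracting $\hat{q}$ from data is precisely how the measurements reviewed here constrain the structure of the medium as perceived by the jet.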

    Bounded Rationality and Heuristics in Humans and in Artificial Cognitive Systems

    In this paper I will present an analysis of the impact that the notion of “bounded rationality”, introduced by Herbert Simon in his book “Administrative Behavior”, produced in the field of Artificial Intelligence (AI). In particular, by focusing on the field of Automated Decision Making (ADM), I will show how the introduction of the cognitive dimension into the study of choice by a rational (natural) agent indirectly determined - in the AI field - the development of a line of research aimed at building artificial systems whose decisions rely on powerful shortcut strategies (known as heuristics) based on “satisficing” - i.e. non-optimal - solutions to problem solving. I will show how the “heuristic approach” to problem solving made it possible, in AI, to tackle problems of combinatorial complexity in real-life situations, and it still represents an important strategy for the design and implementation of intelligent systems.
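Simon's satisficing idea can be illustrated with a minimal sketch: instead of exhaustively searching all options for the optimum, the agent accepts the first option that meets an aspiration level. The function names, the candidate list, and the aspiration threshold below are illustrative choices, not taken from the paper.

```python
# Minimal sketch of satisficing vs. exhaustive optimization (illustrative).

def satisfice(options, utility, aspiration):
    """Return the first option whose utility meets the aspiration level.

    This is the 'satisficing' shortcut: search stops as soon as a
    good-enough option is found, saving the cost of a full scan.
    """
    for option in options:
        if utility(option) >= aspiration:
            return option
    return None  # no satisfactory option exists


def optimize(options, utility):
    """Exhaustive search: examine every option and return the best one."""
    return max(options, key=utility)


candidates = [3, 7, 12, 9, 15]
u = lambda x: x  # trivial utility: larger is better

print(satisfice(candidates, u, aspiration=10))  # stops early, prints 12
print(optimize(candidates, u))                  # scans everything, prints 15
```

The gap between the two answers (12 vs. 15) is the price of the shortcut; its payoff is that satisficing examines only a prefix of the options, which is what makes heuristic strategies viable on combinatorially large search spaces.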