
    Gibbs' paradox and black-hole entropy

    In statistical mechanics, Gibbs' paradox is avoided if the particles of a gas are assumed to be indistinguishable. The resulting entropy then agrees with the empirically tested thermodynamic entropy up to a term proportional to the logarithm of the particle number. We discuss here how analogous situations arise in the statistical foundation of black-hole entropy. Depending on the underlying approach to quantum gravity, the fundamental objects to be counted have to be assumed indistinguishable or not in order to arrive at the Bekenstein--Hawking entropy. We also show that the logarithmic corrections to this entropy, including their signs, can be understood along the lines of standard statistical mechanics. We illustrate the general concepts within the area quantization model of Bekenstein and Mukhanov. Comment: Contribution to Mashhoon festschrift, 13 pages, 4 figures
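As a reminder of the mechanism in ordinary statistical mechanics (a sketch for orientation, not taken from the paper): dividing the N-particle partition function by N! restores extensivity, and the subleading Stirling term is precisely what produces a logarithmic correction to the entropy:

```latex
% Indistinguishability divides the phase-space volume by N!:
S = k_B \ln \frac{Z_1^{\,N}}{N!},
\qquad
\ln N! = N\ln N - N + \tfrac{1}{2}\ln(2\pi N) + \dots
% For the ideal gas this gives the extensive Sackur--Tetrode entropy
% plus a correction of order ln N:
S \approx N k_B \left[\ln\frac{V}{N\lambda^{3}} + \tfrac{5}{2}\right]
        - \tfrac{1}{2}\,k_B \ln(2\pi N)
```

The analogous question for black holes is whether the counted microscopic objects carry such an N! factor, which fixes the sign and coefficient of the logarithmic correction to the Bekenstein--Hawking entropy.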

    Quantum geometrodynamics: whence, whither?

    Quantum geometrodynamics is canonical quantum gravity with the three-metric as the configuration variable. Its central equation is the Wheeler--DeWitt equation. Here I give an overview of the status of this approach. The issues discussed include the problem of time, the relation to the covariant theory, the semiclassical approximation, as well as applications to black holes and cosmology. I conclude that quantum geometrodynamics is still a viable approach and provides insights into both the conceptual and technical aspects of quantum gravity. Comment: 25 pages; invited contribution for the Proceedings of the seminar "Quantum Gravity: Challenges and Perspectives", Bad Honnef, Germany, April 200
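For reference, the Wheeler--DeWitt equation mentioned in the abstract can be written in its standard geometrodynamical form (a sketch; conventions and factor orderings vary between authors):

```latex
% Wheeler--DeWitt equation for the wave functional \Psi[h_{ab}]
% (vacuum case with cosmological constant; conventions vary):
\hat{H}\,\Psi[h_{ab}] =
\left(
  -16\pi G\,\hbar^{2}\,G_{abcd}\,
  \frac{\delta^{2}}{\delta h_{ab}\,\delta h_{cd}}
  - \frac{\sqrt{h}}{16\pi G}\bigl({}^{(3)}R - 2\Lambda\bigr)
\right)\Psi = 0,
% with the DeWitt metric on the space of three-metrics:
G_{abcd} = \frac{1}{2\sqrt{h}}
  \bigl(h_{ac}h_{bd} + h_{ad}h_{bc} - h_{ab}h_{cd}\bigr)
```

Here h_{ab} is the three-metric, h its determinant, and {}^{(3)}R the three-dimensional Ricci scalar; the "problem of time" arises because this constraint contains no external time parameter.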

    Quantum measurement as driven phase transition: An exactly solvable model

    A model of quantum measurement is proposed, which aims to describe statistical mechanical aspects of this phenomenon, starting from a purely Hamiltonian formulation. The macroscopic measurement apparatus is modeled as an ideal Bose gas, the order parameter of which, that is, the amplitude of the condensate, is the pointer variable. It is shown that properties of irreversibility and ergodicity breaking, which are inherent in the model apparatus, ensure the appearance of definite results of the measurement, and provide a dynamical realization of wave-function reduction or collapse. The measurement process takes place in two steps: First, the reduction of the state of the tested system occurs over a time of order \hbar/(T N^{1/4}), where T is the temperature of the apparatus and N is the number of its degrees of freedom. This decoherence process is governed by the apparatus-system interaction. During the second step, classical correlations are established between the apparatus and the tested system over the much longer time-scale of equilibration of the apparatus. The influence of the parameters of the model on non-ideality of the measurement is discussed. Schrödinger kittens, EPR setups and information transfer are analyzed. Comment: 35 pages, revtex

    Nernst Effect in Electron-Doped Pr_{2-x}Ce_{x}CuO_{4}

    The Nernst effect of Pr_{2-x}Ce_{x}CuO_{4} (x = 0.13, 0.15, and 0.17) has been measured on thin-film samples between 5-120 K and 0-14 T. In comparison to recent measurements on hole-doped cuprates that showed an anomalously large Nernst effect above the resistive T_c and H_{c2} \cite{xu,wang1,wang2,capan}, we find a normal Nernst effect above T_c and H_{c2} for all dopings. The lack of an anomalous Nernst effect in the electron-doped compounds supports the models that explain this effect in terms of amplitude and phase fluctuations in the hole-doped cuprates. In addition, the H_{c2}(T) determined from the Nernst effect shows a conventional behavior for all dopings. The energy gap determined from H_{c2}(0) decreases as the system goes from under-doping to over-doping, in agreement with recent tunnelling experiments.

    Superconducting fluctuations and the Nernst effect: A diagrammatic approach

    We calculate the contribution of superconducting fluctuations above the critical temperature T_c to the transverse thermoelectric response \alpha_{xy}, the quantity central to the analysis of the Nernst effect. The calculation is carried out within the microscopic picture of BCS, and to linear order in the magnetic field. We find that as T \to T_c, the dominant contribution to \alpha_{xy} arises from the Aslamazov-Larkin diagrams, and is equal to the result previously obtained from a stochastic time-dependent Ginzburg-Landau equation [Ussishkin, Sondhi, and Huse, arXiv:cond-mat/0204484]. We present an argument which establishes this correspondence for the heat current. Other microscopic contributions, which generalize the Maki-Thompson and density-of-states terms for the conductivity, are less divergent as T \to T_c. Comment: 11 pages, 5 figures

    Physics in the Real Universe: Time and Spacetime

    The Block Universe idea, representing spacetime as a fixed whole, suggests the flow of time is an illusion: the entire universe just is, with no special meaning attached to the present time. This view, however, is based on time-reversible microphysical laws and does not represent macro-physical behaviour and the development of emergent complex systems, including life, which do indeed exist in the real universe. When these are taken into account, the unchanging block universe view of spacetime is best replaced by an evolving block universe which extends as time evolves, with the potential of the future continually becoming the certainty of the past. However, this time evolution is not related to any preferred surfaces in spacetime; rather, it is associated with the evolution of proper time along families of world lines. Comment: 28 pages, including 9 figures. Major revision in response to referee comments

    Decoherence and wave function collapse

    The possibility of consistency between the basic principles of quantum mechanics and wave function collapse is reexamined. A specific interpretation of the environment is proposed for this aim and applied to decoherence. When the organization of a measuring apparatus is taken into account, this approach also leads to an interpretation of wave function collapse, which would result in principle from the same interactions with the environment as decoherence. This proposal is shown to be consistent with the non-separable character of quantum mechanics.

    On the EPR-type Entanglement in the Experiments of Scully et al. I. The Micromaser Case and Delayed-Choice Quantum Erasure

    Delayed-choice erasure is investigated in two-photon two-slit experiments that are generalizations of the micromaser experiment of Scully et al. [Scully, M. O. et al. Nature 351, 111-116 (1991)]. Applying quantum mechanics to the localization detector, it is shown that erasure with delayed choice, in the sense of Scully, has a structure analogous to that of simple erasure. The description goes beyond probabilities. The EPR-type disentanglement, consisting of two mutually incompatible distant measurements, is used as a general framework in both parts of this study. Two simple coherence cases are shown to emerge naturally, and they are precisely the two experiments of Scully et al. The treatment seems to require the relative-reality-of-unitarily-evolving-state (RRUES) approach. Besides providing insight into the experiments, this study also aims at insight into quantum mechanics itself. The question is whether it can be more than just a "book-keeping device" for calculating probabilities, as Scully et al. modestly and cautiously claim. Comment: Latex2e, no figures; this manuscript is the first part of a study in two parts

    The feasibility, proficiency, and mastery learning curves in 635 robotic pancreatoduodenectomies following a multicenter training program: "Standing on the Shoulders of Giants"

    Objective: To assess the feasibility, proficiency, and mastery learning curves for robotic pancreatoduodenectomy (RPD) in "second-generation" RPD centers following a multicenter training program adhering to the IDEAL framework. Background: The long learning curves for RPD reported from "pioneering" expert centers may discourage centers interested in starting an RPD program. However, the feasibility, proficiency, and mastery learning curves may be shorter in "second-generation" centers that participated in dedicated RPD training programs, although data are lacking. We report on the learning curves for RPD in "second-generation" centers trained in a dedicated nationwide program. Methods: Post hoc analysis of all consecutive patients undergoing RPD in 7 centers that participated in the LAELAPS-3 training program, each with a minimum annual volume of 50 pancreatoduodenectomies, using the mandatory Dutch Pancreatic Cancer Audit (March 2016-December 2021). Cumulative sum analysis determined cutoffs for the 3 learning curves: (1) operative time for the feasibility, (2) risk-adjusted major complication (Clavien-Dindo grade >= III) for the proficiency, and (3) textbook outcome for the mastery learning curve. Outcomes before and after the cutoffs were compared for the proficiency and mastery learning curves. A survey was used to assess changes in practice and the most valued "lessons learned." Results: Overall, 635 RPD were performed by 17 trained surgeons, with a conversion rate of 6.6% (n=42). The median annual volume of RPD per center was 22.5 ± 6.8. From 2016 to 2021, the nationwide annual use of RPD increased from 0% to 23%, whereas the use of laparoscopic pancreatoduodenectomy decreased from 15% to 0%. The rate of major complications was 36.9% (n=234), surgical site infection 6.3% (n=40), postoperative pancreatic fistula (grade B/C) 26.9% (n=171), and 30-day/in-hospital mortality 3.5% (n=22).
Cutoffs for the feasibility, proficiency, and mastery learning curves were reached at 15, 62, and 84 RPDs, respectively. Major morbidity and 30-day/in-hospital mortality did not differ significantly before and after the cutoffs for the proficiency and mastery learning curves. Previous experience in laparoscopic pancreatoduodenectomy shortened the feasibility (-12 RPDs, -44%), proficiency (-32 RPDs, -34%), and mastery (-34 RPDs, -23%) learning-curve phases, but did not improve clinical outcomes. Conclusions: The feasibility, proficiency, and mastery learning curves for RPD, at 15, 62, and 84 procedures in "second-generation" centers after a multicenter training program, were considerably shorter than previously reported from "pioneering" expert centers. The learning curve cutoffs and prior laparoscopic experience did not impact major morbidity and mortality. These findings demonstrate the safety and value of a nationwide training program for RPD in centers with sufficient volume.
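The cumulative-sum technique used to locate the learning-curve cutoffs can be sketched as follows. This is a generic CUSUM illustration on synthetic data, not the study's risk-adjusted analysis; the event sequence and the expected complication rate are invented for the example:

```python
# Generic CUSUM learning-curve sketch: accumulate (event - expected_rate)
# over consecutive cases; while the surgeon's complication rate exceeds the
# expected rate the curve rises, and the case at which it peaks is read off
# as the learning-curve cutoff. Data and rate below are illustrative only.

def cusum_cutoff(events, expected_rate):
    """Return (cusum_series, peak_index) for a 0/1 complication sequence."""
    cusum, series, peak_idx, peak_val = 0.0, [], 0, float("-inf")
    for i, event in enumerate(events, start=1):
        cusum += event - expected_rate
        series.append(cusum)
        if cusum > peak_val:          # first strict maximum marks the cutoff
            peak_val, peak_idx = cusum, i
    return series, peak_idx

# Synthetic series: complications frequent in the first 20 cases, rarer after.
events = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0] * 2 + [0, 1, 0, 0] * 10
series, cutoff = cusum_cutoff(events, expected_rate=0.37)
print(f"suggested learning-curve cutoff after case {cutoff}")
```

The study's actual analysis risk-adjusts the expected rate per patient and uses separate endpoints (operative time, major complications, textbook outcome) for the three curves; this sketch only shows the shared change-point idea.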