
    Can the Arrow of Time be understood from Quantum Cosmology?

    I address the question of whether the origin of the observed arrow of time can be derived from quantum cosmology. After a general discussion of entropy in cosmology and some numerical estimates, I give a brief introduction to quantum geometrodynamics and argue that this may provide a sufficient framework for studying this question. I then show that a natural boundary condition of low initial entropy can be imposed on the universal wave function. The arrow of time is then correlated with the size of the Universe and emerges from an increasing amount of decoherence due to entanglement with unobserved degrees of freedom. Remarks are also made concerning the arrow of time in multiverse pictures and in scenarios motivated by dark energy. Comment: 14 pages, to appear in "The Arrow of Time", ed. by L. Mersini-Houghton and R. Vaas

    A Simulation Estimator for Testing the Time Homogeneity of Credit Rating Transition

    The measurement of credit quality is at the heart of the models designed to assess the reserves and capital needed to support the risks of both individual credits and portfolios of credit instruments. A popular specification for credit-rating transitions is the simple, time-homogeneous Markov model. While the Markov specification cannot fully describe rating processes over long horizons, it may be adequate for describing short-run changes in portfolio risk. In this specification, the entire stochastic process can be characterized in terms of estimated transition probabilities. However, the simple homogeneous Markovian transition framework is restrictive. We propose a test of the null hypothesis of time homogeneity that can be performed on the sorts of data often reported. We apply the tests to 4 data sets: commercial paper, sovereign debt, municipal bonds, and S&P Corporates. The results indicate that commercial paper looks Markovian on a 30-day time scale for up to 6 months; sovereign debt also looks Markovian (perhaps due to a small sample size); municipals are well modeled by the Markov specification for up to 5 years, but could probably benefit from frequent updating of the estimated transition matrix or from more sophisticated modeling; and S&P Corporate ratings are approximately Markov over 3 transitions but not 4.
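The abstract's test is the authors' own simulation estimator, which is not reproduced here. As a rough illustration of the idea being tested, the following sketch uses the Chapman-Kolmogorov property: under time homogeneity, the directly estimated two-period transition matrix should be close to the square of the one-period matrix. The rating paths and state labels are made up for illustration.

```python
import numpy as np

def transition_matrix(paths, n_states, lag=1):
    """Estimate a lag-step transition matrix from rating paths by row-normalised counts."""
    counts = np.zeros((n_states, n_states))
    for path in paths:
        for a, b in zip(path[:-lag], path[lag:]):
            counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    # Avoid dividing empty rows (states never observed as a starting point).
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

# Toy rating paths over 3 states (0 = high grade, 2 = default); purely illustrative.
paths = [[0, 0, 1, 1, 2], [0, 1, 1, 2, 2], [1, 1, 0, 0, 1], [0, 0, 0, 1, 1]]

P1 = transition_matrix(paths, 3, lag=1)  # one-period matrix
P2 = transition_matrix(paths, 3, lag=2)  # directly estimated two-period matrix

# Under time homogeneity, P2 should be close to P1 @ P1 (Chapman-Kolmogorov).
discrepancy = np.abs(P2 - P1 @ P1).max()
print(f"max |P2 - P1^2| = {discrepancy:.3f}")
```

A formal test would compare this discrepancy against its sampling distribution (which is what a simulation estimator provides); the sketch only computes the raw statistic.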

    Specification and Informational Issues in Credit Scoring

    Lenders use rating and scoring models to rank credit applicants on their expected performance. The models and approaches are numerous. We explore the possibility that estimates generated by models developed with data drawn solely from extended loans are less valuable than they should be because of selectivity bias. We investigate the value of "reject inference"--methods that use a rejected applicant's characteristics, rather than loan performance data, in scoring model development. In the course of making this investigation, we also discuss the advantages of using parametric as well as nonparametric modeling. These issues are discussed and illustrated in the context of a simple stylized model.
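The selectivity bias discussed above can be seen in a toy simulation: when a lender books only the safer applicants, performance data from booked loans alone give a biased picture of the applicant population. Everything below (the feature, the acceptance rule, the default model) is hypothetical and not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
x = rng.normal(size=n)                    # hypothetical credit feature (higher = safer)
pd_true = 1 / (1 + np.exp(2 * x))         # true default probability, decreasing in x
default = (rng.uniform(size=n) < pd_true).astype(int)

# Lender extends credit only to apparently safer applicants (a noisy cutoff rule).
accepted = x + rng.normal(scale=0.5, size=n) > 0

# Default rate estimated from booked loans only vs. the whole applicant pool.
naive = default[accepted].mean()
population = default.mean()
print(f"default rate, accepted only:  {naive:.3f}")
print(f"default rate, all applicants: {population:.3f}")
```

A model fit only to the accepted sample sees the truncated outcome distribution, which is the gap that reject-inference methods attempt to close using the rejected applicants' characteristics.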

    Gibbs' paradox and black-hole entropy

    In statistical mechanics, Gibbs' paradox is avoided if the particles of a gas are assumed to be indistinguishable. The resulting entropy then agrees with the empirically tested thermodynamic entropy up to a term proportional to the logarithm of the particle number. We discuss here how analogous situations arise in the statistical foundation of black-hole entropy. Depending on the underlying approach to quantum gravity, the fundamental objects to be counted have to be assumed indistinguishable or not in order to arrive at the Bekenstein--Hawking entropy. We also show that the logarithmic corrections to this entropy, including their signs, can be understood along the lines of standard statistical mechanics. We illustrate the general concepts within the area quantization model of Bekenstein and Mukhanov. Comment: Contribution to the Mashhoon festschrift, 13 pages, 4 figures

    Solving the Problem of Time in Mini-superspace: Measurement of Dirac Observables

    One solution to the so-called problem of time is to construct certain Dirac observables, sometimes called evolving constants of motion. There has been some discussion in the literature about the interpretation of such observables, and in particular about whether single Dirac observables can be measured. Here we clarify the situation by describing a class of interactions that can be said to implement measurements of such observables. Along the way, we describe a useful notion of perturbation theory for the rigging map eta of group averaging (sometimes loosely called the physical state "projector"), which maps states from the auxiliary Hilbert space to the physical Hilbert space. Comment: 12 pages, ReVTeX

    Decoherence in the cosmic background radiation

    In this paper we analyze the possibility of detecting nontrivial quantum phenomena in observations of the temperature anisotropy of the cosmic background radiation (CBR), for example, if the Universe could be found in a coherent superposition of two states corresponding to different CBR temperatures. Such observations are sensitive to scalar primordial fluctuations but insensitive to tensor fluctuations, which therefore act as an environment for the former. Even for a free inflaton field minimally coupled to gravity, scalar-tensor interactions induce enough decoherence among histories of the scalar fluctuations to render them classical under any realistic probe of their amplitudes. Comment: 15 pages, accepted for publication in Classical and Quantum Gravity

    Consistency of Semiclassical Gravity

    We discuss some subtleties which arise in the semiclassical approximation to quantum gravity. We show that integrability conditions prevent the existence of Tomonaga-Schwinger time functions on the space of three-metrics but admit them on superspace. The concept of semiclassical time is carefully examined. We point out that central charges in the matter sector spoil the consistency of the semiclassical approximation unless the full quantum theory of gravity and matter is anomaly-free. We finally discuss consequences of these considerations for quantum field theory in flat spacetime, but with arbitrary foliations. Comment: 12 pages, LaTeX, Report Freiburg THEP-94/2

    Quantization in black hole backgrounds

    Quantum field theory in a semiclassical background can be derived as an approximation to quantum gravity from a weak-coupling expansion in the inverse Planck mass. Such an expansion is studied for evolution on "nice slices" in the spacetime describing a black hole of mass M. Arguments for a breakdown of this expansion are presented, due to significant gravitational coupling between fluctuations, which is consistent with the statement that existing calculations of information loss in black holes are not reliable. For a given fluctuation, the coupling to subsequent fluctuations becomes of order unity by a time of order M^3. Lack of a systematic derivation of the weakly-coupled/semiclassical approximation would indicate a role for the non-perturbative dynamics of gravity, and possibly for the proposal that such dynamics has an essentially non-local quality. Comment: 28 pages, 4 figures, harvmac. v2: added refs, minor clarification

    Development and Validation of Credit-Scoring Models

    Accurate credit-granting decisions are crucial to the efficiency of the decentralized capital allocation mechanisms in modern market economies. Credit bureaus and many financial institutions have developed and used credit-scoring models to standardize and automate, to the extent possible, credit decisions. We build credit-scoring models for bankcard markets using the Office of the Comptroller of the Currency, Risk Analysis Division (OCC/RAD) consumer credit database (CCDB). This unusually rich data set allows us to evaluate a number of methods in common practice. We introduce, estimate, and validate our models using both out-of-sample contemporaneous and future validation data sets. Model performance is compared using both separation and accuracy measures. A vendor-developed generic bureau-based score is also included in the model performance comparisons. Our results indicate that current industry practices, when carefully applied, can produce models that robustly rank-order potential borrowers both at the time of development and through the near future. However, these same methodologies are likely to fail when the objective is to accurately estimate future rates of delinquency or probabilities of default for individual borrowers or groups of borrowers.
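The distinction drawn above between separation (rank-ordering borrowers) and accuracy (estimating default rates) can be made concrete with two standard measures: a rank statistic such as the AUC, and a calibration gap between predicted and realised default rates. The sketch below uses synthetic scores, not the paper's data or exact measures; all names are illustrative.

```python
import numpy as np

def auc(scores, defaulted):
    """Separation: probability that a random defaulter scores riskier
    than a random non-defaulter (ties count one half)."""
    pos = scores[defaulted == 1]
    neg = scores[defaulted == 0]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

def calibration_gap(pd_hat, defaulted):
    """Accuracy: gap between mean predicted PD and the realised default rate."""
    return abs(pd_hat.mean() - defaulted.mean())

rng = np.random.default_rng(0)
pd_hat = rng.uniform(0.01, 0.30, size=500)                 # hypothetical model PDs
defaulted = (rng.uniform(size=500) < pd_hat).astype(int)   # outcomes drawn from those PDs

print(f"AUC (separation): {auc(pd_hat, defaulted):.3f}")
print(f"|mean PD - default rate| (accuracy): {calibration_gap(pd_hat, defaulted):.4f}")
```

A score can rank-order well (high AUC) while its implied default rates drift badly, which is the failure mode the abstract flags for individual and group PD estimation.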

    Symmetries, Singularities and the De-Emergence of Space

    Recent work has revealed intriguing connections between a Belinsky-Khalatnikov-Lifshitz-type analysis of spacelike singularities in General Relativity and certain infinite-dimensional Lie algebras, in particular the `maximally extended' hyperbolic Kac--Moody algebra E10. In this essay we argue that these results may lead to an entirely new understanding of the (quantum) nature of space(-time) at the Planck scale, and hence -- via an effective `de-emergence' of space near a singularity -- to a novel mechanism for achieving background independence in quantum gravity. Comment: 10 pages