
    A Computable Economist’s Perspective on Computational Complexity

    A computable economist's view of the world of computational complexity theory is described. This means the model of computation underpinning theories of computational complexity plays a central role. The emergence of computational complexity theories from diverse traditions is emphasised. The unifications that emerged in the modern era were codified by means of the notions of efficiency of computations, non-deterministic computations, completeness, reducibility and verifiability; the latter three concepts had their origins in what may be called 'Post's Program of Research for Higher Recursion Theory'. Approximations, computations and constructions are also emphasised. The recent real model of computation as a basis for studying computational complexity in the domain of the reals is also presented and discussed, albeit critically. A brief sceptical section on algorithmic complexity theory is included in an appendix.
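
    The abstract leans on the notion of verifiability that underlies NP-completeness: a candidate solution (a "certificate") can be checked in time polynomial in the input size even when finding one may be hard. The short sketch below, which is illustrative and not drawn from the paper, checks a truth assignment against a CNF formula in a single linear pass; the function and data layout are assumptions chosen for clarity.

# Minimal sketch of polynomial-time verifiability (illustrative, not from the paper).
# A CNF formula is a list of clauses; each clause is a list of signed integers
# (positive = the variable, negative = its negation). Checking an assignment is
# linear in the formula size, even though finding a satisfying one may be hard.

def verify_certificate(cnf, assignment):
    """Return True iff `assignment` (dict: variable -> bool) satisfies every clause."""
    for clause in cnf:
        if not any(assignment[abs(lit)] == (lit > 0) for lit in clause):
            return False
    return True

# (x1 or not x2) and (x2 or x3)
cnf = [[1, -2], [2, 3]]
print(verify_certificate(cnf, {1: True, 2: True, 3: False}))    # True
print(verify_certificate(cnf, {1: False, 2: False, 3: False}))  # False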

    Thimble regularization at work: from toy models to chiral random matrix theories

    We apply the Lefschetz thimble formulation of field theories to a couple of different problems. We first address the solution of a complex 0-dimensional phi^4 theory. Although very simple, this toy model makes us appreciate a few key issues of the method. In particular, we will solve the model by correctly accounting for all the thimbles that contribute to the partition function, and we will discuss a number of algorithmic solutions for simulating this (simple) model. We will then move to a chiral random matrix (CRM) theory. This is a somewhat more realistic setting, giving us once again the chance to tackle the same couple of fundamental questions: how many thimbles contribute to the solution, and how can we make sure that we correctly sample configurations on the thimble? Since the exact result is known for the observable we study (a condensate), we can verify that, in the region of parameters we studied, only one thimble contributes and that the algorithmic solution we set up works well, despite its very crude nature. The deviation of results from phase-quenched results highlights that in a certain region of parameter space there is a quite severe sign problem. In view of this, the success of our thimble approach is quite a significant one. Comment: 33 pages, 8 figures. Some extra references have been added and subsection 3.1 has been substantially expanded. Some extra comments on numerics have also been added in subsection 4.4. Appendices A and B.1 now feature some more detail.
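
    In zero dimensions the "partition function" of a phi^4 theory with a complex coupling is just an ordinary integral, so it can be evaluated by direct quadrature and compared with its phase-quenched counterpart to gauge how severe the sign problem would be for reweighting-based Monte Carlo. The sketch below is a minimal illustration of that toy setting; the specific action S(x) = (sigma/2) x^2 + (lambda/4) x^4 and the parameter values are assumptions chosen for the example, not the paper's exact choices, and no thimble decomposition is performed here.

# Minimal sketch of a 0-dimensional phi^4 "partition function" with a complex
# coupling (illustrative assumptions, not the paper's exact action/parameters).
import numpy as np

sigma = 1.0 + 3.0j   # complex quadratic coupling -> complex action (assumed value)
lam = 1.0            # quartic coupling (assumed value)

x = np.linspace(-6.0, 6.0, 200001)
S = 0.5 * sigma * x**2 + 0.25 * lam * x**4
w = np.exp(-S)                       # complex Boltzmann weight exp(-S)

Z = np.trapz(w, x)                   # full (complex) partition function
Z_pq = np.trapz(np.abs(w), x)        # phase-quenched partition function

print("Z        =", Z)
print("Z_pq     =", Z_pq)
print("<phase>  =", Z / Z_pq)        # modulus well below 1 <=> severe sign problem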

    Generalized Sums over Histories for Quantum Gravity I. Smooth Conifolds

    This paper proposes to generalize the histories included in Euclidean functional integrals from manifolds to a more general set of compact topological spaces. This new set of spaces, called conifolds, includes nonmanifold stationary points that arise naturally in a semiclassical evaluation of such integrals; additionally, it can be proven that sequences of approximately Einstein manifolds and sequences of approximately Einstein conifolds both converge to Einstein conifolds. Consequently, generalized Euclidean functional integrals based on these conifold histories yield semiclassical amplitudes for sequences of both manifold and conifold histories that approach a stationary point of the Einstein action. Therefore sums over conifold histories provide a useful and self-consistent starting point for further study of topological effects in quantum gravity. Postscript figures available via anonymous ftp at black-hole.physics.ubc.ca (137.82.43.40) in file gen1.ps. Comment: 81 pp., plain TeX, to appear in Nucl. Phys.

    Algorithmic Complexity in Cosmology and Quantum Gravity

    In this article we use the idea of algorithmic complexity (AC) to study various cosmological scenarios, and as a means of quantizing the gravitational interaction. We look at 5D and 7D cosmological models where the Universe begins as a higher dimensional Planck size spacetime which fluctuates between Euclidean and Lorentzian signatures. These fluctuations are governed by the AC of the two different signatures. At some point a transition to a 4D Lorentzian signature Universe occurs, with the extra dimensions becoming "frozen" or non-dynamical. We also apply the idea of algorithmic complexity to study composite wormholes, the entropy of black holes, and the path integral for quantum gravity. Comment: 15 pages
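
    Algorithmic (Kolmogorov) complexity is uncomputable, so in practice it is often estimated by a proxy such as the length of a losslessly compressed description of the object in question. The sketch below shows that standard compression proxy on two binary strings; it is purely illustrative and is not the method the paper uses to assign complexities to the two metric signatures.

# Minimal sketch of a compression-based proxy for algorithmic complexity
# (illustrative only; not the paper's construction).
import random
import zlib

def compression_proxy(s: str) -> int:
    """Length in bytes of the zlib-compressed UTF-8 encoding of s."""
    return len(zlib.compress(s.encode("utf-8"), level=9))

regular = "01" * 500                                        # highly regular string
random.seed(0)
noisy = "".join(random.choice("01") for _ in range(1000))   # random-looking string

print(compression_proxy(regular))  # compresses to very few bytes (low proxy complexity)
print(compression_proxy(noisy))    # noticeably larger (higher proxy complexity)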